Do software agents know what they talk about?
Agents and Ontology
Dr. Patrick De Causmaecker
Nottingham, March 7-11, 2005
Deductive reasoning agents

Logic programming
- First-order logic
- Example: Prolog
- Example: Rule-based systems
- Example: Constraint satisfaction
First-order logic

- Predicates apply to atoms, not to other predicates.
- Quantifiers range over atoms.
- Grelling's paradox (cannot be expressed in first-order logic):
  "If an adjective truly describes itself, call it 'autological', otherwise call it 'heterological'. For example, 'polysyllabic' and 'English' are autological, while 'monosyllabic' and 'pulchritudinous' are heterological. Is 'heterological' heterological? If it is, then it isn't; if it isn't, then it is."
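The paradox needs a predicate that takes predicates as arguments, which is exactly what first-order logic forbids. As a minimal sketch, a naive Prolog encoding makes the regress visible; all names here (applies/2, heterological/1) are ours, and the length test is only a toy criterion:

% applies(Adj, Word): the adjective Adj truly describes Word.
applies(polysyllabic, Word) :-
    atom_length(Word, L), L > 9.   % toy stand-in for "many syllables"
applies(heterological, Word) :-
    heterological(Word).           % closing the circle
heterological(Word) :-
    \+ applies(Word, Word).        % does not describe itself

% ?- heterological(polysyllabic).   % false: it describes itself
% ?- heterological(heterological).  % does not terminate: the paradox
%                                   % shows up as an infinite regress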
Example: Prolog

http://www.ugosweb.com/jiprolog/
Nottingham, March 2005
Agents and Ontology
[email protected]
5
% Facts: a biblical family database
father(terach,abraham).
father(terach,nachor).
father(terach,haran).
father(abraham,isaac).
father(haran,lot).
father(haran,milcah).
mother(sarah,isaac).
male(terach).
male(abraham).
male(nachor).
male(haran).
male(isaac).
male(lot).
female(sarah).
female(milcah).
female(yiscah).
likes(X,pome).
% Rules
son(X,Y) :- father(Y,X), male(X).
daughter(X,Z) :- father(Z,X), female(X).
grandfather(X,Z) :- father(X,Y), father(Y,Z).
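For example, the following queries and answers follow directly from the facts and rules above:

% ?- son(X, terach).
% X = abraham ;
% X = nachor ;
% X = haran.
% ?- daughter(milcah, haran).
% true.
% ?- grandfather(terach, isaac).
% true.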
Towers of Hanoi

% hanoi(N, A, B, C, Moves): Moves is the list of moves that
% transfers N disks from peg A to peg B using peg C as spare.
hanoi(1, A, B, _, [[A,B]]) :- !.
hanoi(N, A, B, C, Moves) :-
    N1 is N - 1,
    hanoi(N1, A, C, B, Ms1),
    hanoi(N1, C, B, A, Ms2),
    append(Ms1, [[A,B]|Ms2], Moves),
    !.
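For example, three disks take 2^3 - 1 = 7 moves (the peg names are arbitrary):

% ?- hanoi(3, left, right, centre, Moves).
% Moves = [[left,right], [left,centre], [right,centre], [left,right],
%          [centre,left], [centre,right], [left,right]]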
Example: Rule-based systems

http://www.expertise2go.com/download/demo.html
RULE [Is the battery dead?]
If [the result of switching on the headlights] = "nothing happens" or
   [the result of trying the starter] = "nothing happens"
Then [the recommended action] = "recharge or replace the battery"

RULE [Is the car out of gas?]
If [the gas tank] = "empty"
Then [the recommended action] = "refuel the car"
RULE [Is the battery weak?]
If [the result of trying the starter] : "the car cranks slowly" "the car cranks normally" and
   [the headlights dim when trying the starter] = true and
   [the amount you are willing to spend on repairs] > 24.99
Then [the recommended action] = "recharge or replace the battery"
RULE [Is the car flooded?]
If [the result of trying the starter] = "the car cranks normally" and
   [a gas smell] = "present when trying the starter"
Then [the recommended action] = "wait 10 minutes, then restart flooded car"
RULE [Is the gas tank empty?]
If [the result of trying the starter] = "the car cranks normally" and
   [a gas smell] = "not present when trying the starter"
Then [the gas tank] = "empty" @ 90
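The rules above are in e2gLite's own syntax. As a rough sketch, the same knowledge can be written as Prolog clauses; every predicate name below is ours, the asked attributes would be asserted as facts, and the "@ 90" certainty factor has no direct Prolog counterpart:

% The car-diagnosis rules as Prolog clauses (a sketch, not e2gLite).
:- dynamic headlight_result/1, starter_result/1, gas_smell/1,
           headlights_dim/1, repair_budget/1.

recommended_action('recharge or replace the battery') :-   % battery dead
    ( headlight_result('nothing happens')
    ; starter_result('nothing happens')
    ).
recommended_action('recharge or replace the battery') :-   % battery weak
    ( starter_result('the car cranks slowly')
    ; starter_result('the car cranks normally')
    ),
    headlights_dim(true),
    repair_budget(B), B > 24.99.
recommended_action('refuel the car') :-                    % out of gas
    gas_tank(empty).
recommended_action('wait 10 minutes, then restart flooded car') :-
    starter_result('the car cranks normally'),
    gas_smell('present when trying the starter').

gas_tank(empty) :-                 % the "@ 90" certainty is lost here
    starter_result('the car cranks normally'),
    gas_smell('not present when trying the starter').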
PROMPT [the result of trying the starter] Choice CF
"What happens when you turn the key to try to start the car?"
"the car cranks normally"
"the car cranks slowly"
"nothing happens"
PROMPT [a gas smell] MultChoice CF
"The smell of gasoline is:"
"present when trying the starter"
"not present when trying the starter"
PROMPT [the result of switching on the headlights] MultChoice CF
"The result of switching on the headlights is:"
"they light up"
"nothing happens"

PROMPT [the headlights dim when trying the starter] YesNo CF
"Do the headlights dim when you try the starter with the lights on?"
Example: Constraint Satisfaction

http://kti.ms.mff.cuni.cz/~bartak/constraints/index.html
Deductive reasoning

- Intelligent behaviour can be achieved by providing the system with a symbolic representation of its environment and allowing it to manipulate this representation syntactically.
- The symbolic representation is a set of logical formulas; the manipulation is deduction, or theorem proving.
[Diagram: a perception-to-action pipeline. Interpretation (pixel manipulation) produces beliefs in the knowledge base, such as dist(me,d1) = 90 cm and door(d1); the resulting plan is STOP, and the executed action is BRAKE!]
Two problems

- Transduction: sufficiently fast transformation of observations into an adequate symbolic representation.
- Representation/reasoning: the symbolic representation as a basis for the manipulation process. Both should be sufficiently fast.
AI approach

- Perception: vision, speech, natural language, learning, …
- Representation: knowledge representation, automatic reasoning, automatic planning.
- A lot of work has been done; the results are still very limited.
Agents as theorem provers

- The internal state of the agent is a database of first-order predicates:
  Open(valve221)
  Temperature(reactor4726,321)
  Pressure(tank776,28)
- This database contains all the beliefs of the agent.
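In Prolog such a belief database is naturally a set of dynamic facts (lower-case names follow Prolog convention; the update query is a hypothetical example):

% A sketch of the belief database as dynamic Prolog facts.
:- dynamic open/1, temperature/2, pressure/2.

open(valve221).
temperature(reactor4726, 321).
pressure(tank776, 28).

% Perception then updates the database, e.g.:
% ?- retract(pressure(tank776, _)), assertz(pressure(tank776, 30)).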
Agents as theorem provers

- Beliefs are neither exact nor complete.
- Interpretation may be faulty.
- Still, these predicates are all the agent has to go on.
Agents as theorem provers

Formally:
- L = {all first-order formulas}
- D = ℘(L) = {all L databases}
- Δ, Δ₁, Δ₂, … ∈ D
- ρ = {deduction rules of the agent}
- Δ ⊢ρ φ means that formula φ from L can be proven from database Δ using rules ρ.
Agents as theorem provers

The agent:
- The perception function:
  see : S -> Per
- The adaptation of the internal state:
  next : D × Per -> D
- The action function:
  action : D -> Ac
Function Action by proof

function action(Δ : D) returns an action Ac
begin
    for each a ∈ Ac do
        if Δ ⊢ρ Do(a) then return a
    end-for
    for each a ∈ Ac do
        if Δ ⊬ρ ¬Do(a) then return a
    end-for
    return null
end
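A minimal Prolog sketch of this selection loop, with Prolog's own resolution standing in for ⊢ρ. All names are ours: actions/1 lists the repertoire, do/1 holds the provable prescriptions, and an explicit forbidden/1 plays the role of the classical ¬Do, which negation-as-failure cannot express directly:

:- dynamic do/1, forbidden/1.

actions([suck, forward, turn]).     % the repertoire Ac (our assumption)

select_action(A) :-                 % first loop: Do(a) is provable
    actions(As), member(A, As),
    do(A), !.
select_action(A) :-                 % second loop: no prohibition provable
    actions(As), member(A, As),
    \+ forbidden(A), !.
select_action(null).                % nothing works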
Example: the vacuum cleaning agent
Vacuum cleaning

- The world:
  In(x,y)
  Dirt(x,y)
  Facing(d)
- Previous information changes:
  old(Δ) = {P(t1,…,tn) | P ∈ {In, Dirt, Facing} and P(t1,…,tn) ∈ Δ}
Vacuum cleaning

- The function new generates new knowledge:
  new : D × Per -> D (exercise)
- One can define next as:
  next(Δ,p) = (Δ \ old(Δ)) ∪ new(Δ,p)
Vacuum cleaning

- Deduction rules are of the form "if φ(…) is consistent with the content of the database, conclude ψ(…)":
  φ(…) → ψ(…)
- Rule 1: work
  In(x,y) ∧ Dirt(x,y) → Do(suck)
- Rule 2: movement
  In(0,0) ∧ Facing(north) ∧ ¬Dirt(0,0) → Do(forward)
  In(0,1) ∧ Facing(north) ∧ ¬Dirt(0,1) → Do(forward)
  In(0,2) ∧ Facing(north) ∧ ¬Dirt(0,2) → Do(turn)
  In(0,2) ∧ Facing(east) → Do(forward)
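These rules drop straight into Prolog, with negation-as-failure standing in for the classical ¬ (a sketch; lower-case names are Prolog convention):

:- dynamic in/2, dirt/2, facing/1.

do(suck)    :- in(X,Y), dirt(X,Y).
do(forward) :- in(0,0), facing(north), \+ dirt(0,0).
do(forward) :- in(0,1), facing(north), \+ dirt(0,1).
do(turn)    :- in(0,2), facing(north), \+ dirt(0,2).
do(forward) :- in(0,2), facing(east).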
Conclusions

- Rather impractical…
- The agent must try to determine its optimal action by reasoning.
- This takes time (deductive systems are slow).
- The world may have changed in the meantime…
- "Calculative rationality": the agent decides on the action that was optimal at the time the reasoning process started.
  - Not always acceptable.
Other problems

- Logic is elegant but slow.
- The see function lies in a difficult, poorly understood sector of AI.
- The vacuum cleaning problem was already difficult to describe!
Agent-oriented programming: Agent0 (Shoham 1993)

- Desire, belief, intention
- In Agent0 an agent consists of:
  - capabilities,
  - initial beliefs,
  - initial commitments,
  - rules to derive commitments (commitment rules).
Agent0

A commitment rule consists of:
- A message condition, to be matched against received messages
- A mental condition, to be matched against the beliefs and intentions
- An action, to be selected if the conditions hold
Agent0

Two kinds of actions:
- Communicative
- Private

Three kinds of messages:
- Requests for action
- Unrequests to cancel requested action
- Inform messages carrying information
COMMIT(
  (agent, REQUEST, DO(time, action)),                ;;; message condition
  (B, [now, Friend agent] AND CAN(self, action)
      AND NOT [time, CMT(self, anyaction)]),         ;;; mental condition
  self,
  DO(time, action)
)
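Agent0 itself is not Prolog, but as a sketch the same commitment rule could be checked and recorded like this (every predicate name below is our own invention):

:- dynamic received/3, belief/1, can/2, commitment/2.

% commit(Agent, Time, Action): fire the rule above and record
% the commitment if both conditions hold.
commit(Agent, Time, Action) :-
    received(Agent, request, do(Time, Action)),   % message condition
    belief(friend(Agent)),                        % mental condition:
    can(self, Action),                            %  friend, capable,
    \+ commitment(Time, _),                       %  and not already
    assertz(commitment(Time, Action)).            %  committed at Time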
[Diagram: the Agent0 interpreter loop — incoming messages initialize and update the beliefs and commitments; given the agent's abilities, due commitments are executed as internal actions and outgoing messages.]