Artificial Intelligence
CS 165A
Thursday, November 15, 2007
 Knowledge Representation (Ch 10)
Notes
• HW assignments
– HW#4 due Wednesday (11/21), HW#5 due 12/4
• Schedule
– Three weeks left
– Knowledge representation (Ch. 10), probabilistic reasoning (Ch. 13, 14)
– What else?
 Planning
 Perception (speech, language, vision)
 Robotics
 Examples of AI research and/or applications
• Precedence of AND, OR
Correction to 10/30 Lecture notes, slide #9
Precedence of operators (logical connectives)
• Levels of precedence, evaluating left to right:
1. ¬ (NOT)
2. ∧ (AND, conjunction)
3. ∨ (OR, disjunction)
4. ⇒ (implies, conditional)
5. ⇔ (equivalence, biconditional)
• P ∧ ¬Q ⇒ R
– (P ∧ (¬Q)) ⇒ R
• P ∨ Q ∧ R
– P ∨ (Q ∧ R)
• P ⇔ Q ∧ R ⇒ S
– P ⇔ ((Q ∧ R) ⇒ S)
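A quick way to check a grouping is to enumerate truth assignments. The Python sketch below (not from the lecture; the propositional variables are illustrative) compares the precedence-respecting reading of P ∨ Q ∧ R with a naive left-to-right grouping and prints the assignments where they disagree.

from itertools import product

# Compare the two candidate groupings of P ∨ Q ∧ R over every truth
# assignment; only P or (Q and R) follows the stated precedence
# (AND binds tighter than OR).
for P, Q, R in product([True, False], repeat=3):
    with_precedence = P or (Q and R)    # AND evaluated first
    left_to_right = (P or Q) and R      # naive left-to-right grouping
    if with_precedence != left_to_right:
        print(P, Q, R, with_precedence, left_to_right)

# One of the printed lines: True False False True False -- so the two
# readings really are different sentences.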
Forward and Backward Chaining
• Forward chaining
– Data driven or data directed
– New version of TELL(KB, p)
 Add the sentence p, then apply inference rules to the updated KB until no more rules apply (“chaining” – “chain reaction”)
• Backward chaining
– Goal oriented
– ASK(KB, q)
– If SUBST(θ, q) is in KB, return q' = SUBST(θ, q)
– Else, find implication sentences p ⇒ q, then set p as a subgoal
 Keep doing this, working “backwards”
 If p is not in KB, look for r ⇒ p, then set r as a subgoal
 Etc…
– Backward chaining is the basis for logic programming (e.g., Prolog)
(What form would you want your KB to be in to best support backward chaining?)
Forward chaining example
KB = { }
1. TELL(KB, Buffalo(x) ∧ Pig(y) ⇒ Outrun(x,y))
2. TELL(KB, Pig(x) ∧ Slug(y) ⇒ Outrun(x,y))
3. TELL(KB, Outrun(x,y) ∧ Outrun(y,z) ⇒ Outrun(x,z))
4. TELL(KB, Buffalo(Bob))
5. TELL(KB, Pig(Pat))
6. TELL(KB, Slug(Steve))
What happens at every step?
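To make “what happens at every step” concrete, here is a minimal forward chainer in Python. It is a sketch under simplifying assumptions (simple pattern matching instead of full unification, facts as tuples, variables written as '?x'), not the algorithm from the text.

from itertools import product

# Facts are tuples; variables start with '?'.
facts = {('Buffalo', 'Bob'), ('Pig', 'Pat'), ('Slug', 'Steve')}

rules = [  # (premises, conclusion), mirroring TELLs 1-3 above
    ([('Buffalo', '?x'), ('Pig', '?y')], ('Outrun', '?x', '?y')),
    ([('Pig', '?x'), ('Slug', '?y')], ('Outrun', '?x', '?y')),
    ([('Outrun', '?x', '?y'), ('Outrun', '?y', '?z')], ('Outrun', '?x', '?z')),
]

def match(pattern, fact, bindings):
    """Extend bindings so that pattern matches fact, or return None."""
    if len(pattern) != len(fact):
        return None
    b = dict(bindings)
    for p, f in zip(pattern, fact):
        if p.startswith('?'):
            if b.get(p, f) != f:      # conflicting binding
                return None
            b[p] = f
        elif p != f:
            return None
    return b

def substitute(term, bindings):
    return tuple(bindings.get(t, t) for t in term)

def forward_chain(facts, rules):
    """Apply rules until no new facts appear (the 'chain reaction')."""
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            # Try every combination of current facts against the premises.
            for combo in product(sorted(facts), repeat=len(premises)):
                bindings = {}
                for prem, fact in zip(premises, combo):
                    bindings = match(prem, fact, bindings)
                    if bindings is None:
                        break
                if bindings is None:
                    continue
                new_fact = substitute(conclusion, bindings)
                if new_fact not in facts:
                    print('Derived:', new_fact)
                    facts.add(new_fact)
                    changed = True
    return facts

forward_chain(set(facts), rules)
# Steps 4-6 add the base facts; chaining then derives Outrun(Bob, Pat),
# Outrun(Pat, Steve), and finally Outrun(Bob, Steve) via rule 3.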
Backward chaining example
KB:
Pig(y) ∧ Slug(z) ⇒ Faster(y,z)
Slimy(z) ∧ Creeps(z) ⇒ Slug(z)
Pig(Pat)
Slimy(Steve)
Creeps(Steve)
ASK(KB, Faster(Pat, Steve))
Call the query q. Is q in KB? No
So look for implication sentences p ⇒ q
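For comparison, here is a minimal backward chainer for this KB, again only a sketch: atoms are tuples, variables are '?'-prefixed strings, and rule variables are not standardized apart (safe here because the query is ground).

rules = [  # (conclusion, premises); facts are rules with no premises
    (('Faster', '?y', '?z'), [('Pig', '?y'), ('Slug', '?z')]),
    (('Slug', '?z'), [('Slimy', '?z'), ('Creeps', '?z')]),
    (('Pig', 'Pat'), []),
    (('Slimy', 'Steve'), []),
    (('Creeps', 'Steve'), []),
]

def unify(a, b, bindings):
    """Tiny matcher: variables may appear in either term."""
    if len(a) != len(b):
        return None
    b_ = dict(bindings)
    for x, y in zip(a, b):
        x, y = b_.get(x, x), b_.get(y, y)
        if x == y:
            continue
        if x.startswith('?'):
            b_[x] = y
        elif y.startswith('?'):
            b_[y] = x
        else:
            return None
    return b_

def substitute(term, bindings):
    return tuple(bindings.get(t, t) for t in term)

def prove(goal, bindings):
    """Yield bindings for every way the goal can be proved."""
    goal = substitute(goal, bindings)
    for conclusion, premises in rules:
        b = unify(goal, conclusion, bindings)
        if b is not None:
            yield from prove_all(premises, b)   # premises become subgoals

def prove_all(goals, bindings):
    if not goals:
        yield bindings
    else:
        for b in prove(goals[0], bindings):
            yield from prove_all(goals[1:], b)

# ASK(KB, Faster(Pat, Steve)): chain backwards from the goal.
print(any(prove(('Faster', 'Pat', 'Steve'), {})))   # True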
At this point we have…
• A powerful logic (FOL) in which we can express many or most things of interest
• Two powerful inference rules and their normal forms
– Generalized Modus Ponens
– Generalized Resolution
 Both use unification
• Ways to convert any FOL sentences (a KB) into normal form (CNF or INF)
• Inference strategies: data-driven and goal-driven
– Forward chaining
– Backward chaining
• Search methods and a problem formulation method
We have AI, more or less
• We can now build rational agents that receive percepts, reason about their world and implicit goals, and act upon their world
– Problem-solving agents
• We could also consider how to set goals and subgoals for our agents; how to construct and execute plans that achieve the agent’s goals
– Planning agents (not covered in this course)
Applications of logical reasoning systems
• Logical reasoning systems often referred to as
– Knowledge-based systems
– Rule-based systems
• Two common kinds of reasoning systems
– Expert systems
– Production systems
[Diagram: expert systems and production systems as kinds of knowledge-based systems]
Expert systems
• Expert system: a computer program embodying the knowledge and ability of an expert in a task domain
– Built with the help and guidance of human experts
– Seek to perform as well as or better than human experts on specific tasks
– Historically rule-based (but less so now – could be probabilistic)
– Many in use in business, science, engineering, and the military
– Basic underlying theory: Horn KBs with resolution refutation
• Some examples
– Medical diagnosis (MYCIN)
– Science (DENDRAL – chemical spectral analysis)
– Mathematics theorem proving
– Geological exploration
– System repair
– VLSI chip layout
– Help desk
– Computer system configuration
– Chess
Production systems
• A production is a condition-action rule
– An “if-then” rule: if the condition holds, then the action is applicable
• A production system is a knowledge-based system that uses productions to match the state of the KB with applicable actions
– p ∧ q ⇒ action1
– r ⇒ action2
• Aspects of a production system
– ADD(KB, p) with forward chaining
– Match phase
– Conflict resolution phase
– Choice of action
[Diagram: perceive–reason–act cycle]
Production systems (cont.)
• Match phase: Which rules have left-hand sides (“conditions”) that match the KB?
– Rules (productions) are stored separately from knowledge
– May have heuristic rules for ordering rule application
• Conflict resolution phase: Which of the matching rules should be executed?
– I.e., which applicable action should be taken?
• Notes
– Example of an early system: R1, for configuring VAX 780s
– Led to early versions of expert systems
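The match / conflict-resolution / act cycle can be sketched in a few lines of Python. Everything here (the rule names, the “most specific rule wins” strategy, the contents of working memory) is an illustrative assumption, not taken from the slides.

# A toy production-system cycle: match all rules against working memory,
# resolve conflicts (here: prefer the rule with the most conditions),
# then fire one action.
working_memory = {'p', 'q'}

productions = [
    # (name, conditions, action)
    ('rule1', {'p', 'q'}, lambda wm: wm.add('action1_done')),
    ('rule2', {'r'},      lambda wm: wm.add('action2_done')),
]

def run_cycle(wm):
    # Match phase: which left-hand sides are satisfied by the KB?
    matched = [p for p in productions if p[1] <= wm]
    if not matched:
        return False
    # Conflict resolution: pick one applicable rule (most specific here).
    name, conds, action = max(matched, key=lambda p: len(p[1]))
    # Act: execute the chosen rule's right-hand side.
    action(wm)
    print('Fired', name)
    return True

run_cycle(working_memory)   # Fired rule1; working memory now holds action1_done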
Limitations of formal logic
• Formal logic sometimes doesn’t perform well in the real world
– “Brittle” if there is a (hidden) contradiction in the KB
– Most categories have fuzzy boundaries
– Most rules have exceptions
– Cause/effect is usually not completely straightforward
• Examples:
– Modus ponens and science
– Medical diagnosis
 P ⇒ Q: If a patient has the flu, the patient will have a fever.
 Q: The patient has a fever. What can be concluded?
 Answer: Logically, nothing
• Issues:
– Handling of arbitrary logical expressions
– Complex semantics
– Uncertainty
– Computational efficiency
– Human interaction
Logic and uncertainty
• For example, consider the rule “Raining(t) ⇒ WetGrass(t)”
– Is this always true?
 Can a strict [T, F] answer apply in the real world?
 What if it’s not raining?
– We’d rather know the full relationship between Raining and WetGrass
• FOL deals with all, not all, some, none
– The real world is not so simple
• Complexity is often manifested as uncertainty
– Rather than a very large number of rules that cover every case, we may have a few rules that capture most of the cases
– This may be a result of ignorance or laziness
• We need ways to reason about or in the presence of uncertainty [coming soon, Ch. 13 and 14]
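As a preview of the probabilistic treatment in Ch. 13, the sketch below replaces the strict rule Raining ⇒ WetGrass with a small joint distribution and computes P(WetGrass | Raining); the numbers are invented purely for illustration.

# Preview of the probabilistic view (Ch. 13): store a joint distribution
# over Raining and WetGrass and query it. The numbers are made up.
joint = {  # P(Raining, WetGrass)
    (True,  True):  0.19,
    (True,  False): 0.01,   # rain, but the grass is covered
    (False, True):  0.16,   # e.g., a sprinkler wet the grass
    (False, False): 0.64,
}

def p_wet_given_rain(raining):
    num = joint[(raining, True)]
    den = joint[(raining, True)] + joint[(raining, False)]
    return num / den

print(p_wet_given_rain(True))    # approx. 0.95 -- close to, but not exactly, 1
print(p_wet_given_rain(False))   # approx. 0.20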
Knowledge
• We’ve also mostly finessed the issues regarding what knowledge to represent and how
– What is the domain of objects?
– How do we represent more complex knowledge than simple predicates?
– We can describe object properties and relations between objects, but how to describe actions, situations, and events?
– What if the state of the world changes?
– Can we reason about categories of objects?
• Chapter 10 raises these kinds of issues under the topic of knowledge representation
Thursday quiz
Give an English description of the following sentence in FOL using situation calculus:
∀x, s Studying(x, s) ⇒ Failed(x, Result(TakeTest, s))
Knowledge engineering
• The process of knowledge base construction (either special-purpose or general-purpose KBs) is called knowledge engineering
• The knowledge engineering process:
– Identify the task
– Assemble the relevant knowledge
– Decide on a vocabulary of predicates, functions, and constants
– Encode general knowledge about the domain
– Encode a description of the specific problem instance
– Pose queries to the inference procedure and get answers
– Debug the knowledge base
See the electronic circuits domain example in Section 8.4
Ontological engineering
• Ontology – a theory of the nature of being or existence
– What exists, what can be known (in a particular domain)?
• Ontological engineering builds a formal ontology for a particular domain
– Defines categories and their relations (e.g., inheritance)
 Taxonomy of categories and subcategories
– Defines/limits what can possibly be stated, and reasoned about, in the domain
• Would like to reason about actions, events, and situations
– Need a way to efficiently consider time, or time sequences
Reification
• Category → Object
– Can represent basketballs using the predicate Basketball(x) or by reifying the category as an object, Basketballs
 Member(x, Basketballs)
– Object x is a member of the category Basketballs
 Subset(Basketballs, Balls)
– Basketballs is a subcategory of Balls
– Class properties apply to specific class objects
 x must be spherical, must bounce, etc.
– “isa” (member or subclass) relationship
 subcategory isa category
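A tiny sketch of reification in Python: categories are ordinary objects we can assert Member and Subset facts about, and an object inherits the properties of every category it belongs to. The object name BB27 and the property sets are illustrative assumptions.

# Member/Subset mirror the predicates on the slide; the property data is invented.
subset = {('Basketballs', 'Balls')}            # Subset(Basketballs, Balls)
members = {('BB27', 'Basketballs')}            # Member(BB27, Basketballs)
properties = {'Balls': {'spherical'}, 'Basketballs': {'bounces', 'orange'}}

def categories_of(x):
    """All categories x belongs to, following Subset links upward."""
    cats = {c for (m, c) in members if m == x}
    changed = True
    while changed:
        bigger = {sup for (sub, sup) in subset if sub in cats}
        changed = not bigger <= cats
        cats |= bigger
    return cats

def properties_of(x):
    """x inherits the properties of every category it belongs to."""
    return set().union(*(properties.get(c, set()) for c in categories_of(x)))

print(properties_of('BB27'))   # bounces, orange, spherical (in some order)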
Semantic networks
• A semantic network is a directed graph consisting of
– Vertices – representing concepts
– Edges – representing semantic relations between the concepts
• “Visual logic”
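A semantic network can be stored directly as labeled, directed edges. The concepts and relations below are illustrative, not from the lecture.

# A semantic network as (concept, relation, concept) edges.
edges = [
    ('Basketball', 'isa', 'Ball'),
    ('Ball', 'isa', 'PhysicalObject'),
    ('Basketball', 'color', 'Orange'),
]

def related(source, relation):
    """Follow edges with the given label out of a concept."""
    return [t for (s, r, t) in edges if s == source and r == relation]

print(related('Basketball', 'isa'))    # ['Ball']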
The Frame Problem
• The frame problem is the problem of expressing a dynamic domain in logic without explicitly specifying which conditions are not affected by an action
– If the light is on at time t1, do we have to explicitly say it’s on at time t2?
– This can become very burdensome...
• The frame problem can be addressed (partially) with
– Including situations (Si) that describe the state of the environment at particular points in time
 S0 → S1 → S2 → ...
 Doesn’t have to represent equally spaced units of time
– Representing the results of actions from Si to Sj
– Having an ontology of time
Situation Calculus – actions, events
• “Situation Calculus” is a way of describing change over time in first-order logic
– Fluents: Functions or predicates that can vary over time have an extra argument, Si (the situation argument)
 Predicate(args, Si)
 Location of an agent, aliveness, changing properties, ...
– The Result function is used to represent change from one situation to another resulting from an action (or action sequence)
 Result(GoForward, Si) = Sj
 “Sj is the situation that results from the action GoForward applied to situation Si”
 Result() indicates the relationship between situations
Situation Calculus
[Diagram slides: representing the world in different “situations” and the relationship between situations]
Examples
• How would you interpret the following sentences in First-Order Logic using situation calculus?
∀x, s Studying(x, s) ⇒ Failed(x, Result(TakeTest, s))
If you’re studying and then you take the test, you will fail.
(or) Studying a subject implies that you will fail the test for that subject.
∀x, s TurnedOn(x, s) ∧ LightSwitch(x) ⇒ TurnedOff(x, Result(FlipSwitch, s))
If you flip the light switch when it is turned on, it will then be turned off.
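One concrete (and simplified) way to play with fluents and Result() is to treat a situation as an immutable snapshot of fluent values, as in the Python sketch below; the representation is an assumption for illustration, not the book's axiomatization.

from dataclasses import dataclass

@dataclass(frozen=True)
class Situation:
    fluents: frozenset   # e.g. frozenset({('TurnedOn', 'Switch1')})

def holds(fluent, s):
    """Predicate(args, Si): does the fluent hold in situation s?"""
    return fluent in s.fluents

def result(action, s):
    """Result(action, s) -> the successor situation."""
    if action == 'FlipSwitch':
        on = ('TurnedOn', 'Switch1')
        off = ('TurnedOff', 'Switch1')
        if holds(on, s):
            return Situation(s.fluents - {on} | {off})
        return Situation(s.fluents - {off} | {on})
    return s   # frame assumption: other actions leave these fluents alone

s0 = Situation(frozenset({('TurnedOn', 'Switch1'), ('LightSwitch', 'Switch1')}))
s1 = result('FlipSwitch', s0)
print(holds(('TurnedOff', 'Switch1'), s1))   # True, matching the second FOL sentence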