M263: Building Blocks of Software

Block II, Unit III: Symbolic AI in the world
This unit has four main sections:
• Planning
• Robots
• Learning, adaptation and heuristics
• Uncertainty
Planning
• Planning might appear to be just another form of problem solving.
• In Symbolic AI, problem solving consists of setting a system to an initial state, defining a goal state and then defining all of the possible actions the system can take.
• The system then searches through the space of possible states looking for a solution.
• To take a simple example, consider solving the problem of buying apples from a shop.
• The initial state is being at home with no apples; the goal state is being back at home with some apples.
• Between the two lies a state space that may be something like the one shown in the following figure.
[Figure: the state space for the apple-buying problem]
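As a rough illustration of state-space search, the sketch below runs a breadth-first search over a toy, hand-invented fragment of the apple-buying state space. The state names and actions are assumptions of the sketch, not the course's actual figure.

    # A minimal sketch of state-space search (breadth-first) over a toy,
    # hand-invented state space for the apple-buying problem.
    from collections import deque

    # Each state maps to a list of (action, next_state) pairs -- an
    # illustrative fragment only, not the course's actual state space.
    STATE_SPACE = {
        "at home, no apples": [("walk to shop", "at shop, no apples"),
                               ("watch TV", "at home, no apples")],
        "at shop, no apples": [("buy apples", "at shop, have apples")],
        "at shop, have apples": [("walk home", "at home, have apples")],
    }

    def search(start, goal):
        """Breadth-first search: return the list of actions reaching the goal."""
        frontier = deque([(start, [])])
        visited = {start}
        while frontier:
            state, plan = frontier.popleft()
            if state == goal:
                return plan
            for action, nxt in STATE_SPACE.get(state, []):
                if nxt not in visited:
                    visited.add(nxt)
                    frontier.append((nxt, plan + [action]))
        return None

    print(search("at home, no apples", "at home, have apples"))
    # -> ['walk to shop', 'buy apples', 'walk home']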
This is an oversimplified picture of the problem:
• In reality, each level of the tree may have thousands, if not millions, of branches, and the tree itself might have hundreds of levels.
• Exhaustive search of such a space is clearly infeasible, so heuristic techniques have to be brought in to speed up searches.
• A good heuristic would tell the system that shopping is a good way of acquiring new items (including apples).
• The search could then be directed along the shopping branch.
• A further heuristic might then guide the search towards shops that sell fruit.
• But a more serious difficulty is that search forces the system to start either at the initial state or at the goal state and work towards the other: the search program must examine each of the initial actions before moving on to the next.
• By comparison, planning relies on making direct connections between states and actions.
• Plans, composed of states, goals and actions, are described to the computer in a system of formal logic. 'Have some apples' is an English-language description of a goal; the logical expression Have(apples) is its equivalent.
• Actions are described in the same manner.
• Humans use their knowledge base to solve problems.
• Now picture a computer program attempting to solve this simple problem of buying apples. With all the possible inputs and the constraints it encounters, this will not be an easy job!
• A general action: Buy(x), which results in having x.
Sub-Planning
• The planning process allows the problem to be broken into independent chunks known as sub-plans.
• An example of the success and failure of sub-planning is illustrated in the following sections on Blocks World.
Blocks world
• The real world is an incredibly complex and chaotic place.
• Considering all of these fine details, however, can obscure the essentials of how planning (and other tasks) is done.
• One answer is to eliminate all the messy details by constructing a very simple world in which the planner can operate.
• Attention can then be focused on the core problem: the construction of the plan.
• One such simplified world has played a leading part in the development of AI systems. It is usually known as Blocks World.
• Blocks World was used as an environment for early natural language understanding systems and robots.
• Blocks World is closely linked with the problem of planning and with the early planning system STRIPS.
• Blocks World is a tiny ‘world’ comprising an (infinitely large) flat table on which sit a set of children’s building blocks.
• The blocks can be moved around and stacked on top of one another by a single robot hand.
• The hand can only hold one block at a time.
• Blocks World is most often simulated inside a computer, so all blocks are presumed to be perfectly regular and the movements of the arm infinitely precise.
• Planning in Blocks World means deciding the steps required to move blocks from an initial configuration (the start state) to another configuration (the goal state).
• A state can be described in logic, for example:

    On(B,C) ^ OnTable(C) ^ OnTable(A) ^ HandEmpty
• The robot hand manipulates the world by picking up blocks and moving them around.
• A block x may only be picked up if both of the following are satisfied:
  • The robot hand is empty (HandEmpty).
  • There is no block sitting on top of the selected block (Clear(x)).
The hand can execute simple commands:
• PickUp(A) picks up Block A, provided that the block is clear and the hand is empty;
• PutDown(A) places Block A on the table, provided that the hand is holding the block;
• Stack(A,B) places Block A on top of Block B, provided the hand is holding A and the top face of B is clear;
• UnStack(A,B) removes Block A from Block B, provided that the hand is empty and the top of A is clear.
The predicates used to describe the state, and the commands that change it:

    To describe the state    Process/command
    On(x,y)                  PickUp(x)
    OnTable(x)               PutDown(x)
    HandEmpty                Stack(x,y)
    Clear(x)                 UnStack(x,y)
Planning in the Blocks world
• Describe the initial state and the goal state of the following:
[Figure: initial and goal configurations]
Planning in the Blocks world: divide the problem
• From the initial state we want to end up with Block A on the table, Block C on the table and Block B on top of Block A.
Planning in the Blocks world: operators
• The planner knows what actions it can perform, and the consequences of those actions.
• Actions are expressed as operators. Each operator has four parts: its name, a set of preconditions, an add list and a delete list.
• Executing an operator changes the world state: the facts in the add list are added to the state, and the facts in the delete list are removed from it.
[Figure: the Blocks World operators, with their preconditions, add lists and delete lists]
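As a rough illustration, operators of this kind might be encoded as follows. The Operator class, the string spelling of facts and the two constructor functions are assumptions of this sketch, not the actual STRIPS implementation.

    # A minimal sketch of a STRIPS-style operator: a name, preconditions,
    # an add list and a delete list. Facts are plain strings for simplicity.
    from dataclasses import dataclass

    @dataclass
    class Operator:
        name: str
        preconditions: frozenset
        add_list: frozenset
        delete_list: frozenset

        def applicable(self, state):
            return self.preconditions <= state

        def apply(self, state):
            """Return the new world state after executing this operator."""
            assert self.applicable(state), f"preconditions of {self.name} not met"
            return (state - self.delete_list) | self.add_list

    def stack(a, b):
        """Stack(a,b): put block a, currently held, on top of block b."""
        return Operator(
            name=f"Stack({a},{b})",
            preconditions=frozenset({f"Holding({a})", f"Clear({b})"}),
            add_list=frozenset({f"On({a},{b})", f"Clear({a})", "HandEmpty"}),
            delete_list=frozenset({f"Holding({a})", f"Clear({b})"}),
        )

    def unstack(a, b):
        """UnStack(a,b): remove block a from block b, leaving it in the hand."""
        return Operator(
            name=f"UnStack({a},{b})",
            preconditions=frozenset({"HandEmpty", f"Clear({a})", f"On({a},{b})"}),
            add_list=frozenset({f"Holding({a})", f"Clear({b})"}),
            delete_list=frozenset({"HandEmpty", f"Clear({a})", f"On({a},{b})"}),
        )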
Planning using means-end analysis: STRIPS
• First, the goal conditions are added to the agenda.
• Planning then proceeds by popping the first condition from the agenda and, if it is not already true, finding an operator that can achieve it.
• The operator’s action is then pushed onto the agenda, as is each of the operator’s precondition terms.
• Achieving each of these preconditions requires its own sub-plan.
• The process continues until the only things left on the agenda are actions.
• If these are performed, in sequence, the goals will be achieved (see the sketch below).
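A minimal sketch of this agenda loop, reusing the Operator class from the sketch above. The choose_achiever parameter stands in for STRIPS's operator-selection machinery and is an assumption of the sketch, which also skips re-checking goals that later become false.

    def plan(state, goals, choose_achiever):
        """Simplified STRIPS-style agenda loop (illustrative, not the full algorithm)."""
        agenda = list(reversed(goals))      # end of the list = top of the agenda
        actions = []
        while agenda:
            item = agenda.pop()
            if isinstance(item, Operator):  # an action: execute it
                state = item.apply(state)
                actions.append(item.name)
            elif item in state:             # a goal that is already true: discard
                continue
            else:                           # a goal to achieve: find an operator
                op = choose_achiever(item, state)
                agenda.append(op)           # the action goes on the agenda...
                for p in op.preconditions:  # ...with its preconditions above it
                    agenda.append(p)
        return actions, state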
STRIPS starts with the three goal conditions being added to the agenda:

    OnTable(A)
    On(B,A)
    OnTable(C)

• The topmost element, OnTable(A), is already true, so there is nothing to be done to achieve it; it is popped from the agenda and discarded.
• The second term is not already true, so the system finds the Stack operator to achieve it. Stack(B,A) is pushed onto the agenda, and the operator’s preconditions (Clear(A) and Holding(B)) are pushed on top of it. The agenda is now:

    Clear(A)
    Holding(B)
    Stack(B,A)
    OnTable(C)
The process begins again:
• Clear(A) is already true, so that goal is discarded without action.
• Holding(B) will become true after an UnStack(B,C) operation, so that operator is pushed onto the agenda together with its preconditions. At this stage the agenda is:

    Clear(B)
    On(B,C)
    UnStack(B,C)
    Stack(B,A)
    OnTable(C)

• The top two goals on the agenda are already true, so they are popped from the agenda and discarded.
• The two operations (UnStack(B,C) and Stack(B,A)) are performed, in that order.
• The final goal (OnTable(C)) is already true and so is removed.
• As the agenda is empty, all the goals have been achieved and the planning has succeeded.
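Running the sketches above on this example reproduces the trace. The hard-wired choose_achiever below is an illustrative stand-in that only knows about Stack and UnStack; it is not part of STRIPS itself.

    import re

    def choose_achiever(goal, state):
        """Hard-wired operator selection for this one example (illustrative)."""
        m = re.match(r"Holding\((\w+)\)$", goal)
        if m:                        # pick the block off whatever it sits on
            a = m.group(1)
            b = next(y for y in "ABC" if f"On({a},{y})" in state)
            return unstack(a, b)
        m = re.match(r"On\((\w+),(\w+)\)$", goal)
        if m:
            return stack(m.group(1), m.group(2))
        raise ValueError(f"no operator known for goal {goal}")

    # Initial state: B on C, A and C on the table, hand empty.
    initial = {"On(B,C)", "OnTable(A)", "OnTable(C)",
               "Clear(A)", "Clear(B)", "HandEmpty"}
    goals = ["OnTable(A)", "On(B,A)", "OnTable(C)"]

    actions, final = plan(set(initial), goals, choose_achiever)
    print(actions)   # -> ['UnStack(B,C)', 'Stack(B,A)']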
[Figure: a worked STRIPS planning example]
STRIPS is not always successful:
• Goal state: On(A,B) ^ On(B,C) ^ OnTable(C)
• The sub-plans’ goals are achieved, but the overall plan is not (the Sussman anomaly).
• The cause of the problem is the order of execution and the dependencies between the sub-plans.
Planning using means-end analysis: partial-order planning systems
• The technical term for when completing one sub-plan undoes the achievements of another is clobbering.
• Solution: partial-order planning systems. The planner in this case commits itself to ensuring that the operations within each sub-plan occur in order, but they can be preceded, followed or interleaved with steps from other sub-plans.
• Once all the actions for each sub-plan have been described, the planner attempts to combine the actions in such a way as to minimize clobbering.
Robots
• Purpose
• Categories/domains:
  • Medical
  • Security
  • Services
  • Submarine work
  • Manufacturing
  • Mars missions
  • …
• Examples
Shakey (Stanford Research Institute, SRI)
• 1966
• Lived in an indoor environment
• Could perform simple tasks, such as going from one room to another
• Nowadays, Shakey is retired at the Computer History Museum in Mountain View, California, USA
[Photo: Shakey the robot]
The Soviet Union’s moon rover: Lunokhod
• In November 1970, Lunokhod 1 entered lunar orbit.
• It was the first remotely operated vehicle to explore another world.
• Its length was 2.3 metres and it weighed around 750 kg.
• The rover ran during the lunar day, stopping occasionally to recharge its batteries via its solar panels.
• At night the rover hibernated until the next sunrise, kept warm by a radioactive heat source.
• Controllers finished the last communication session with Lunokhod 1 at 13:05 UT on September 14, 1971.
• Lunokhod 1 was located by a research team from the University of California, San Diego in 2010.
[Photo: the Lunokhod moon rover]
Spirit and Opportunity: Mars exploration rovers
• Launched from Earth in 2003.
• Landed on Mars in early 2004.
• The Opportunity rover stands 1.5 m high, 2.3 m wide and 1.6 m long, and weighs 180 kg.
• At the time of writing, both rovers were still alive, transferring images and Mars soil tests on a daily basis, in addition to other scientific results about Mars.
[Photo: Opportunity, Mars exploration rover]
Beagle 2: Mars exploration lander (laboratory)
• Beagle 2 was an unsuccessful British landing spacecraft that formed part of the European Space Agency’s 2003 Mars Express mission.
• It is not known for certain whether the lander reached the Martian surface.
• All contact with it was lost upon its separation from Mars Express, six days before its scheduled entry into the atmosphere.
• It may have missed Mars altogether, skipped off the atmosphere and entered an orbit around the Sun, or burned up during its descent.
• If it reached the surface, it may have hit too hard, or else failed to contact Earth due to a fault.
• It was a promising mission: Beagle 2 carried an advanced laboratory.
Learning, Adaptation and Heuristics

• One characteristic that we would surely associate with an intelligent individual, natural or artificial, is the ability to learn from its environment, whether this means widening the range of tasks it can perform or performing the same tasks better.
• If we really want to understand the nature of intelligence, we have to understand learning.
• Another reason for investigating learning is to make the development of intelligent systems easier.
• Rather than equipping a system with all the knowledge it needs, we can develop a system that begins with adequate behavior but learns to become more competent.
• The ability to learn is also the ability to adapt to changing circumstances, a vital feature of any system.

• In Symbolic AI systems, behavior is governed by the processes defined for that system.
• If a system is to learn, it must alter these, either by modifying existing processes or by adding new ones.
• Many existing learning systems have the task of classification: the system is presented with a set of examples and learns to classify these into different categories.
• The learning can be either supervised (where the correct classifications are known to the learner) or unsupervised (where the learner has to work out the classifications for itself).
Other approaches to automated learning include:
• speed-up learning: a system remembers situations it has been in before and the actions it took then. When it encounters a similar situation later, it decides on an action by remembering what it did last time, rather than determining it from first principles all over again (see the sketch below);
• inductive programming: a learning system is presented with the inputs and desired outputs of a program or procedure. The system has to derive the program that satisfies these constraints.
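A minimal sketch of the speed-up idea as memoization; solve_from_first_principles is a hypothetical placeholder for whatever expensive reasoning the system would otherwise repeat.

    # Speed-up learning as memoization: remember (situation -> action) pairs
    # and reuse them instead of re-deriving the action from first principles.
    cache = {}

    def solve_from_first_principles(situation):
        # Hypothetical stand-in for an expensive planning/search computation.
        return f"action-for-{situation}"

    def decide(situation):
        if situation not in cache:          # unseen situation: think hard
            cache[situation] = solve_from_first_principles(situation)
        return cache[situation]             # seen before: just remember

    decide("traffic jam")   # computed from first principles
    decide("traffic jam")   # retrieved from memory, much faster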
Decision trees
• A decision tree is a way of classifying objects or situations.
• Each leaf node of the tree represents a class the object could belong to.
• Each internal node represents a test of the value of one attribute of the object.
• As each attribute is tested, we move down the tree until we reach a correct classification.
• So a decision tree is a way of representing an order in which to ask questions about an object (or directly observe its attributes) in order to place it in the right class.
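A minimal sketch of this representation, with a decision tree hand-built as nested dictionaries. The attribute values and branch layout are assumptions of the sketch; they echo, but are not copied from, the study/TV example used later.

    # A decision tree as nested dicts: internal nodes test an attribute,
    # leaves (plain strings) name the class. The structure is illustrative.
    tree = {
        "attribute": "Schedule?",
        "branches": {
            "behind": "study",                   # leaf
            "ahead": "watch TV",                 # leaf
            "on schedule": {                     # another attribute test
                "attribute": "Good TV?",
                "branches": {"yes": "watch TV", "no": "study"},
            },
        },
    }

    def classify(node, example):
        """Walk down the tree, testing one attribute at each internal node."""
        while isinstance(node, dict):
            node = node["branches"][example[node["attribute"]]]
        return node                              # a leaf: the class

    print(classify(tree, {"Schedule?": "on schedule", "Good TV?": "no"}))
    # -> study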
Decision trees, an example
[Figure: an example decision tree]
Training data:
[Table 3.4: the training data]
Training data and learning
• To learn a decision tree from training data, we identify the most discriminating attribute and split the data on the values of that attribute.
• For instance, in the data shown in Table 3.4, the most discriminating attribute seems to be ‘Schedule?’: if the student is behind schedule, the student will always study; if she is on schedule, she studies more often than not; otherwise she will often watch TV.
• The next most important attribute appears to be ‘Good TV?’: if there is nothing good on the TV she will nearly always study M366.
• Building the tree up level by level leads to the partial tree shown in the figure below.
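One standard way to pick the "most discriminating" attribute is information gain, as in ID3-style tree learners. The tiny dataset below is invented to mirror the tendencies just described; it is not the book's Table 3.4.

    # Choosing the most discriminating attribute by information gain.
    from collections import Counter
    from math import log2

    def entropy(labels):
        counts = Counter(labels)
        total = len(labels)
        return -sum((n / total) * log2(n / total) for n in counts.values())

    def information_gain(rows, attribute):
        """Entropy reduction from splitting the rows on one attribute."""
        base = entropy([r["class"] for r in rows])
        remainder = 0.0
        for value in {r[attribute] for r in rows}:
            subset = [r["class"] for r in rows if r[attribute] == value]
            remainder += len(subset) / len(rows) * entropy(subset)
        return base - remainder

    # Invented rows mirroring the text's tendencies (not the book's table).
    rows = [
        {"Schedule?": "behind", "Good TV?": "yes", "class": "study"},
        {"Schedule?": "behind", "Good TV?": "no",  "class": "study"},
        {"Schedule?": "on",     "Good TV?": "yes", "class": "watch TV"},
        {"Schedule?": "on",     "Good TV?": "no",  "class": "study"},
        {"Schedule?": "on",     "Good TV?": "no",  "class": "study"},
        {"Schedule?": "ahead",  "Good TV?": "yes", "class": "watch TV"},
        {"Schedule?": "ahead",  "Good TV?": "no",  "class": "watch TV"},
        {"Schedule?": "ahead",  "Good TV?": "yes", "class": "watch TV"},
    ]

    best = max(["Schedule?", "Good TV?"],
               key=lambda a: information_gain(rows, a))
    print(best)   # -> Schedule?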
Decision tree learned from the data table
[Figure: the decision tree learned from the data table]
Uncertainty
• AI systems are expected to move outside the laboratory, so they must face a world that is complex and, above all, uncertain; they will have to cope with that uncertainty.
• As we all know, most human judgments are provisional. For instance, when a weather forecaster informs us that it is going to rain tomorrow, we know that she is not really expressing definite knowledge: she is only offering a probability.
• The AI community has developed strategies for reasoning about situations where precise information is either unavailable or unnecessary.

• The issue of uncertainty first came to prominence in diagnostic expert systems such as MYCIN, a program for diagnosing bacterial blood infections.
• Such systems have to account for imprecision in the results of tests and for non-certain reasoning steps, for example:

    IF the stain of the organism is gram-positive
    AND the morphology of the organism is coccus
    AND the growth conformation of the organism is clump
    THEN (0.7) the identity of the organism is staphylococcus

• Here, the 0.7 is the certainty factor of this conclusion given the antecedents.
• The certainty factors of each deduction enabled MYCIN to track how reliable it believed each conclusion to be, and to report a final, combined certainty for the reliability of its diagnosis back to the user.
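A minimal sketch of certainty-factor bookkeeping, assuming MYCIN's usual conventions: conjunctive evidence is only as certain as its weakest antecedent, and two positive CFs for the same conclusion combine as cf1 + cf2 * (1 - cf1). The antecedent CFs and the second rule below are invented.

    # Certainty-factor bookkeeping in the style of MYCIN (numbers invented).

    def rule_cf(rule_strength, antecedent_cfs):
        """CF of a rule's conclusion: the rule's strength attenuated by the
        weakest antecedent (conjunctive evidence), floored at zero."""
        return rule_strength * max(0.0, min(antecedent_cfs))

    def combine(cf1, cf2):
        """Combine two positive CFs supporting the same conclusion."""
        assert cf1 >= 0 and cf2 >= 0
        return cf1 + cf2 * (1 - cf1)

    # The rule from the text, with assumed CFs for its three antecedents:
    cf_staph = rule_cf(0.7, [0.9, 0.8, 1.0])   # -> 0.7 * 0.8 = 0.56
    # A second (hypothetical) rule also concluding staphylococcus, CF 0.3:
    print(combine(cf_staph, 0.3))              # -> 0.56 + 0.3 * 0.44 = 0.692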
Uncertainty: Bayesian probability
• An AI approach that is widely used is based on mathematical probability theory and Bayesian statistics.
• In the Bayesian view of probability, the probability of a proposition’s being true reflects the strength of our belief in that proposition, generally in the light of some supporting information.
• The prior probability of a proposition h (such as ‘the battery is flat’) is written P(h).
• If we have some evidence e that can influence the probability of h (e.g. ‘the lights are dim’), we can deduce the posterior or conditional probability of the proposition h given e, which we denote as P(h | e). Bayes’ rule gives:

    P(h | e) = P(e | h) × P(h) / P(e)
• P(e | h) is the probability of e being true if h is true.
• A worked example is given on page 146.
• The result is: the probability of there being a fire, given that the alarm sounds, is 0.0094.
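A minimal sketch of the same calculation; the probabilities below are invented for illustration (they are not the page 146 figures, so the answer differs slightly from 0.0094).

    # Bayes' rule with illustrative (invented) numbers -- not the figures
    # from the course text's page-146 example.
    def posterior(p_e_given_h, p_h, p_e):
        """P(h|e) = P(e|h) * P(h) / P(e)."""
        return p_e_given_h * p_h / p_e

    # h = 'there is a fire', e = 'the alarm sounds' (probabilities assumed):
    print(posterior(p_e_given_h=0.95, p_h=0.001, p_e=0.1))   # -> ~0.0095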
Fuzzy logic

• Fuzzy logic deals with the situation where we know all about an entity but it belongs to more than one category.
• Consider this question: am I (are you) very tall, tall, medium or short? Which category do I (do you) belong to?
• There’s no cut-and-dried answer to this question. I’m fairly tall – taller than most of my colleagues – but a dwarf compared to the average American basketball player.
• I’m much taller than, say, the landlady of my house.
• An illustration is presented in the next figure.
[Figure: fuzzy membership of height categories]
• In fuzzy logic the boundaries between categories are fuzzy: a person might be tall in some contexts but short in others.
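A minimal sketch of the idea, assuming simple piecewise-linear membership functions with invented breakpoints: a given height can belong partly to several categories at once.

    # Piecewise-linear fuzzy membership for height (breakpoints invented).
    def ramp(x, lo, hi):
        """0 below lo, 1 above hi, linear in between."""
        if x <= lo:
            return 0.0
        if x >= hi:
            return 1.0
        return (x - lo) / (hi - lo)

    def tall(cm):
        return ramp(cm, 170, 190)

    def short(cm):
        return 1.0 - ramp(cm, 150, 170)

    def medium(cm):
        return min(ramp(cm, 150, 170), 1.0 - ramp(cm, 170, 190))

    h = 180  # a 1.80 m person is partly 'medium' and partly 'tall'
    print(short(h), medium(h), tall(h))   # -> 0.0 0.5 0.5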