Markov Algorithms - Gunadarma University


Markov Algorithms
An Alternative Model of
Computation
An Algorithm Scheme
Example 4.1.1. Let Σ be the alphabet {a, b, c, d}. By a
Markov algorithm scheme or schema we shall mean a finite
sequence of productions or rewrite rules. As a first
example, consider the following two-member sequence of
productions.
(i) a → c
(ii) b → ε
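The execution rule behind such a scheme can be made concrete with a small simulator. The sketch below (names are my own, not from the text) applies, at each step, the first production whose left-hand side occurs in the current word, rewriting its leftmost occurrence; it halts when a terminal production fires or when no production applies.

```python
def run_markov(productions, word, max_steps=10_000):
    """Simulate a Markov algorithm scheme.

    productions: ordered list of (lhs, rhs, is_terminal) triples,
    where is_terminal=True encodes a production of form alpha -> .beta.
    """
    for _ in range(max_steps):
        for lhs, rhs, terminal in productions:
            i = word.find(lhs)            # leftmost occurrence; "" matches at 0
            if i != -1:
                word = word[:i] + rhs + word[i + len(lhs):]
                if terminal:
                    return word            # terminal production halts at once
                break                      # restart scan from the first production
        else:
            return word                    # no production applicable: halt
    raise RuntimeError("step limit exceeded (possibly non-terminating)")

# Example 4.1.1: (i) a -> c, (ii) b -> epsilon (the empty word)
scheme_411 = [("a", "c", False), ("b", "", False)]
print(run_markov(scheme_411, "abcd"))      # every a becomes c, every b is erased
```

On input abcd the run rewrites a to c, then erases the b, and halts with ccd once neither production applies.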
Another Example
Example 4.1.2. Let Σ be the alphabet {a, b, c, d}. Next,
consider the following three-member sequence of
productions.
(i) a → c
(ii) bc → cb
(iii) b → .cd
Appends ab
Example 4.1.3. Let input alphabet Σ = {a, b}. Let work
alphabet Γ be {#}. We see that the Markov algorithm
scheme consisting of the four-member production sequence
(i) #a → a#
(ii) #b → b#
(iii) # → .ab
(iv) ε → #
has the effect of appending string ab to any word over Σ.
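Tracing this scheme by hand on input ba: production (iv) prepends the marker #, productions (ii) and (i) walk it rightward past each letter, and (iii) fires terminally at the right end, yielding baab. A quick check with a throwaway simulator (my own sketch, not from the text):

```python
def run(rules, w):
    """Apply the first applicable rule at its leftmost occurrence until a
    terminal rule fires or no rule applies; rules are (lhs, rhs, terminal)."""
    while True:
        for lhs, rhs, term in rules:
            i = w.find(lhs)               # note: "" is found at index 0
            if i != -1:
                w = w[:i] + rhs + w[i + len(lhs):]
                if term:
                    return w
                break
        else:
            return w

# Example 4.1.3: (#a -> a#), (#b -> b#), (# -> .ab), (epsilon -> #)
append_ab = [("#a", "a#", False), ("#b", "b#", False),
             ("#", "ab", True), ("", "#", False)]
print(run(append_ab, "ba"))    # baab
print(run(append_ab, ""))      # ab
```

Note that once # is present, production (iv) never fires again, because some earlier production always matches first.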
Reverse Word
• Input word w over Σ = {a, b} is transformed
by AS into output word wR, the reversal of w
Markov Algorithms
• We can implement each of our three
computational paradigms
• Language acceptance (recognition)
• Function computation
• Transduction
A Formal Definition
Markov algorithm schema S = any triple ⟨Σ, Γ, Π⟩
• Σ a nonempty input alphabet
• Γ a finite work alphabet with Σ ⊆ Γ
• Π a finite, ordered sequence of productions of form α → β
or of form α → .β
• α and β (possibly empty) words over Γ
Language Acceptance
Example 4.2.1
• Input alphabet Σ = {a, b}
• Work alphabet Γ = {@, %, $, 1}
• Six productions, one of them terminal
• Transforms all and only words in language
{a^n b^m | n ≥ 0, m ≥ 1} to word 1
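The six productions of Example 4.2.1 are not reproduced on the slide. Purely as an illustration, here is a scheme of my own devising (seven productions, using only the work symbols % and 1 rather than the slide's full {@, %, $, 1}) that also transforms exactly the words of {a^n b^m | n ≥ 0, m ≥ 1} into 1:

```python
def run(rules, w, cap=10_000):
    """Leftmost-occurrence, first-applicable-rule Markov simulator."""
    for _ in range(cap):
        for lhs, rhs, term in rules:
            i = w.find(lhs)
            if i != -1:
                w = w[:i] + rhs + w[i + len(lhs):]
                if term:
                    return w
                break
        else:
            return w
    raise RuntimeError("step limit exceeded")

# Hypothetical acceptor for {a^n b^m | n >= 0, m >= 1}: any "ba"
# (letters out of order) collapses into the dead marker %, which absorbs
# its neighbours and is never erased; otherwise "ab" pairs drop their a,
# runs of b merge, and a lone b rewrites terminally to 1.
accept_anbm = [("ba", "%", False), ("%a", "%", False), ("%b", "%", False),
               ("a%", "%", False), ("ab", "b", False), ("bb", "b", False),
               ("b", "1", True)]

for w in ["b", "aab", "abb", "ba", "a", ""]:
    print(repr(w), "->", repr(run(accept_anbm, w)))
```

Words in the language reduce to the single word 1; every other word halts with some output different from 1 (often containing the marker %), which is exactly what acceptance requires.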
Some Conventions
• S a Markov algorithm schema
• Input alphabet Σ
• Work alphabet Γ with 1 ∈ Γ
• S accepts word w if w ⇒* 1
Markov-Acceptable Language
• Schema S accepts language L if S accepts
all and only words in L.
• A language that is accepted by some
Markov algorithm is said to be a Markov-acceptable language.
Example
• Language {a^n b^m | n ≥ 0, m ≥ 1} is accepted
by the schema of Example 4.2.1
Language Recognition
• Input alphabet Σ and work alphabet Γ
with 0, 1 ∈ Γ \ Σ
• S recognizes language L over Σ such that
• S transforms w ∈ L into 1, that is,
w ⇒* 1 (accepting 1)
• S transforms w ∉ L into 0, that is,
w ⇒* 0 (rejecting 0)
• A language recognized by some Markov
algorithm S is said to be Markov-recognizable
Resources (Time)
• Definition
• timeS(n) = the maximum number of steps
in any terminating computation of S for
input of length n
Resources (Space)
• Computation of a Markov algorithm =
sequence of computation words
• Definition
• spaceS(n) = the maximum length of any
computation word in any terminating
computation of S for input of length n
Resource Analysis
• Example 4.2.3 (Same Number of a's and b's)
accepts language {w ∈ Σ* | na(w) = nb(w)}
with Σ = {a, b}
• timeS(n) = (n/2)^2 + n + 2 for even n, so O(n^2)
• spaceS(n) = n + 1, so O(n)
• Compare single-tape Turing machine
Function Computation
• Example
• One terminal production 1 → .11
• Computes unary successor function
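With number n encoded as the word 1^(n+1), the single terminal production 1 → .11 rewrites the leftmost 1 to 11 and halts immediately, turning 1^(n+1) into 1^(n+2), i.e. computing f(n) = n + 1. A minimal check (simulator sketch, my own):

```python
def run(rules, w):
    """Leftmost-occurrence, first-applicable-rule Markov simulator."""
    while True:
        for lhs, rhs, term in rules:
            i = w.find(lhs)
            if i != -1:
                w = w[:i] + rhs + w[i + len(lhs):]
                if term:
                    return w
                break
        else:
            return w

successor = [("1", "11", True)]          # the scheme's one terminal production
print(run(successor, "111"))             # input encodes 2; output 1111 encodes 3
```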
Formal Definition
• S computes unary partial number-theoretic
function f
• S applied to input 1^(n+1) yields output 1^(f(n)+1)
• If S is applied to input 1^(n+1), where function f
is not defined for n, then either S's
computation never terminates or its output
is not of form 1^m
Two More Examples
• ⌊log n⌋ computes f(n) = ⌊log₂ n⌋ if n > 0 and is
undefined otherwise
• Example 4.3.6 computes binary function
f(n, m) = |n − m| mod 3
Labeled Markov Algorithm That Accepts
{w | na(w) = nb(w) = nc(w)}
L1: a → ε; L2
    ε → ε; L4
L2: b → ε; L3
    ε → ε; L5
L3: c → ε; L1
    ε → ε; L5
L4: b → ε; L5
    c → ε; L5
    ε → 1; L5
L5: ε → .ε
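This scheme deletes one a, one b, and one c per round (labels L1 to L3), and when the a's run out, label L4 checks whether any b or c remains; only an empty word earns the output 1. The sketch below simulates it under an assumed block semantics, inferred from the listing rather than stated on the slide: at the current label, try that block's productions in order, apply the first whose left-hand side occurs (at its leftmost occurrence), and jump to the production's target label.

```python
def run_labeled(blocks, w, start="L1", cap=10_000):
    """Simulate a labeled Markov algorithm under the block semantics
    described above; a terminal production (alpha -> .beta) halts."""
    label = start
    for _ in range(cap):
        for lhs, rhs, target, terminal in blocks[label]:
            i = w.find(lhs)
            if i != -1:
                w = w[:i] + rhs + w[i + len(lhs):]
                if terminal:
                    return w
                label = target
                break
        else:
            return w          # no production in the block applies: halt
    raise RuntimeError("step limit exceeded")

# The scheme above, with "" standing for the empty word epsilon:
blocks = {
    "L1": [("a", "", "L2", False), ("", "", "L4", False)],
    "L2": [("b", "", "L3", False), ("", "", "L5", False)],
    "L3": [("c", "", "L1", False), ("", "", "L5", False)],
    "L4": [("b", "", "L5", False), ("c", "", "L5", False),
           ("", "1", "L5", False)],
    "L5": [("", "", "L5", True)],
}
print(run_labeled(blocks, "cab"))     # balanced counts: output 1 (accepted)
print(run_labeled(blocks, "aab"))     # unbalanced: output is not 1
```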
Other Examples
• Palindromes
• m divides n
• ⌊log n⌋ – labeled algorithms
Equivalence Result
• Let S be a labeled Markov algorithm with
input alphabet Σ. Then there exists a standard
Markov algorithm S′ with input alphabet Σ
that is computationally equivalent to S.
• The converse is obvious
Equivalence Result
• The class of Markov-computable functions is
identical to the class of Turing-computable functions
• Function f is Markov-computable iff f is
Turing-computable
Proof
• Example 4.5.1/Turing machine that
simulates a Markov algorithm
• For the other direction, see Example
4.5.2/Markov algorithm that simulates a
Turing machine
• Generalizations
• Given Turing machine M accepting L, there
exists a Markov algorithm AS that accepts L in
O(timeM(n)) steps
• Given Markov algorithm AS accepting L,
there exists a Turing machine M that accepts
L in O([timeAS(n)]^4) steps
Another Example
  =  and single production 11  1
• Computes unary constant-0 function
C01(n) = 0 for all n
• So C01 is Markov-computable
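Since input n is the word 1^(n+1) and output 0 is the word 1, the single production 11 → 1 simply shrinks the input one symbol at a time until only 1 remains; then no production applies and the algorithm halts. A minimal check (simulator sketch, my own):

```python
def run(rules, w):
    """Leftmost-occurrence, first-applicable-rule Markov simulator."""
    while True:
        for lhs, rhs, term in rules:
            i = w.find(lhs)
            if i != -1:
                w = w[:i] + rhs + w[i + len(lhs):]
                if term:
                    return w
                break
        else:
            return w

const_zero = [("11", "1", False)]
print(run(const_zero, "1111"))   # input encodes 3; output 1 encodes 0
```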
Another Example
f(n) = 2n − 1 if n ≥ 1
f(n) = 0 if n = 0
computed by
(i) $1 → 11$
(ii) 111$ → .1
(iii) 11$ → .1
(iv) ε → $
Computing k-ary Functions
• If S is applied to input 1^(n1+1) 1^(n2+1) … 1^(nk+1),
then S yields output word 1^(f(n1, n2, …, nk)+1)
• If S is applied to input 1^(n1+1) 1^(n2+1) … 1^(nk+1),
where function f is not defined for arguments
n1, n2, …, nk, then either S never halts or its
output word is not of form 1^m for m ≥ 1