The Future of Computer Science


Exploring the Limits of the Efficiently Computable
Research Directions in Computational Complexity and Physics That I Find Exciting
Scott Aaronson (MIT)
Papers & slides at www.scottaaronson.com
Quantum Mechanics in One Slide
Probability Theory:

\[
\begin{pmatrix} s_{11} & \cdots & s_{1n} \\ \vdots & \ddots & \vdots \\ s_{n1} & \cdots & s_{nn} \end{pmatrix}
\begin{pmatrix} p_1 \\ \vdots \\ p_n \end{pmatrix}
=
\begin{pmatrix} q_1 \\ \vdots \\ q_n \end{pmatrix},
\qquad p_i \ge 0, \quad \sum_{i=1}^{n} p_i = 1
\]

Linear transformations that conserve the 1-norm of probability vectors: Stochastic matrices

Quantum Mechanics:

\[
\begin{pmatrix} u_{11} & \cdots & u_{1n} \\ \vdots & \ddots & \vdots \\ u_{n1} & \cdots & u_{nn} \end{pmatrix}
\begin{pmatrix} \alpha_1 \\ \vdots \\ \alpha_n \end{pmatrix}
=
\begin{pmatrix} \beta_1 \\ \vdots \\ \beta_n \end{pmatrix},
\qquad \alpha_i \in \mathbb{C}, \quad \sum_{i=1}^{n} |\alpha_i|^2 = 1
\]

Linear transformations that conserve the 2-norm of amplitude vectors: Unitary matrices
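To make the norm-preservation point concrete, here is a minimal numerical check in Python/NumPy (an illustration added to this transcript, not part of the original slides): a stochastic matrix applied to a probability vector conserves the 1-norm, while a unitary matrix applied to an amplitude vector conserves the 2-norm.

    import numpy as np

    # Stochastic matrix: nonnegative entries, each column summing to 1
    S = np.array([[0.9, 0.2],
                  [0.1, 0.8]])
    p = np.array([0.3, 0.7])            # probability vector, 1-norm = 1
    q = S @ p
    print(np.sum(q))                    # -> 1.0 (1-norm conserved)

    # Unitary matrix: U†U = I (here a real Hadamard-type rotation)
    U = np.array([[1.0,  1.0],
                  [1.0, -1.0]]) / np.sqrt(2)
    alpha = np.array([0.6, 0.8j])       # amplitude vector, 2-norm = 1
    beta = U @ alpha
    print(np.sum(np.abs(beta) ** 2))    # -> 1.0 (2-norm conserved)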
Quantum Computing
A general entangled state of n qubits requires ~2^n amplitudes to specify:

\[
|\psi\rangle = \sum_{x \in \{0,1\}^n} \alpha_x |x\rangle
\]
Presents an obvious practical problem when using
conventional computers to simulate quantum mechanics
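To give a rough sense of scale (an illustrative calculation added here, not from the slides): storing 2^50 complex amplitudes at 16 bytes each already takes about 2^50 × 16 ≈ 1.8 × 10^16 bytes, roughly 18 petabytes, and every additional qubit doubles that requirement.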
Feynman 1981: So then why not turn things around, and
build computers that themselves exploit superposition?
Could such a machine get any advantage over a classical
computer with a random number generator? If so, it would
have to come from interference between amplitudes
BQP (Bounded-Error Quantum Polynomial-Time): The class of problems solvable efficiently by a quantum computer, defined by Bernstein and Vazirani in 1993
Shor 1994: Factoring integers is in BQP
[Diagram: conjectured relationships among the complexity classes P, NP, NP-complete, and BQP, with Factoring placed inside BQP and NP but outside P]
Examples of My Past Work
Quantum lower bound for the collision problem [A. 2002]
Quantum (+classical!) lower bound for local search [A. 2004]
First direct product theorem for quantum search [A. 2004]
PostBQP = PP [A. 2004]
BQP vs. the polynomial hierarchy: black-box relation problems in BQP but not BPP^PH [A. 2009]
Publicly-verifiable quantum money [A.-Christiano 2012]
BQP/qpoly ⊆ QMA/poly, learnability of quantum states [A.-Drucker 2010, A. 2004, A. 2006]
Algebrization [A.-Wigderson 2008]
This Talk: Three Recent Directions
1. Meeting Experimentalists Halfway
Using complexity theory to find quantum advantage in
systems of current experimental interest (e.g. linear-optical
networks), which fall short of universal quantum computers
2. Computational Complexity and Black Holes
An amazing role for complexity theory in the recent
“firewall” debate and the AdS/CFT correspondence
3. Physical Universality
When Turing-universality isn’t enough: the complexity and
realizability of physical transformations
1. Meeting Experimentalists Halfway
BosonSampling (A.-Arkhipov 2011)
A rudimentary type of quantum computing, involving
only non-interacting photons
Classical counterpart:
Galton’s Board
Replacing the balls by photons leads to
famously counterintuitive phenomena,
like the Hong-Ou-Mandel dip
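As a worked illustration of that interference (added here; it assumes one common beamsplitter convention, U = (1/√2)[[1, 1], [1, −1]]): with one photon entering each input of a 50/50 beamsplitter, the two ways of getting a coincidence (both photons transmitted, or both reflected) have amplitudes that cancel,

\[
\frac{1}{\sqrt{2}}\cdot\left(-\frac{1}{\sqrt{2}}\right) + \frac{1}{\sqrt{2}}\cdot\frac{1}{\sqrt{2}} = 0,
\]

so the photons always exit together: the Hong-Ou-Mandel dip.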
In general, we consider a network of
beamsplitters, with n input “modes” (locations)
and m>>n output modes
n identical photons enter, one per input mode
Assume for simplicity they all leave in different modes—there are $\binom{m}{n}$ possibilities
The beamsplitter network defines a column-orthonormal matrix $A \in \mathbb{C}^{m \times n}$, such that

\[
\Pr[\text{outcome } S] = \left|\operatorname{Per}(A_S)\right|^2,
\qquad \text{where } \operatorname{Per}(X) = \sum_{\sigma \in S_n} \prod_{i=1}^{n} x_{i,\sigma(i)}
\]

and $A_S$ is the $n \times n$ submatrix of A corresponding to S
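A minimal Python sketch of that definition (added for this transcript; the naive permutation sum is exponentially slow and only meant for tiny n):

    import itertools
    import numpy as np

    def permanent(X):
        """Per(X) = sum over permutations sigma of prod_i X[i, sigma(i)]."""
        n = X.shape[0]
        return sum(np.prod([X[i, sigma[i]] for i in range(n)])
                   for sigma in itertools.permutations(range(n)))

    def outcome_probability(A, S):
        """Pr[outcome S] = |Per(A_S)|^2, where A_S keeps the rows of A indexed by S."""
        A_S = A[list(S), :]
        return abs(permanent(A_S)) ** 2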
Amazing connection between permanents and physics, which
even leads to a simpler proof of Valiant’s famous result that
the permanent is #P-complete [A. 2011]
So, Can We Use Quantum Optics to
Solve a #P-Complete Problem?
That sounds way too good to be true…
Explanation: If X is sub-unitary, then |Per(X)|^2 will usually be exponentially small. So to get a reasonable estimate of |Per(X)|^2 for a given X, we'd generally need to repeat the optical experiment exponentially many times
Better idea: Given ACmn as input, let BosonSampling
be the problem of merely sampling from the same
distribution DA that the beamsplitter network samples
from—the one defined by Pr[S]=|Per(AS)|2
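For very small instances, D_A can simply be written out and sampled by brute force. The following Python sketch (illustrative only, added here; it reuses the permanent and outcome_probability helpers above and ignores outcomes with repeated modes, per the simplifying assumption) makes the definition concrete:

    import itertools
    import numpy as np

    def boson_sampling_bruteforce(A, rng=np.random.default_rng()):
        """Sample an outcome S from D_A by enumerating all C(m, n) collision-free outcomes."""
        m, n = A.shape
        outcomes = list(itertools.combinations(range(m), n))
        probs = np.array([outcome_probability(A, S) for S in outcomes])
        probs /= probs.sum()          # renormalize over the collision-free outcomes only
        return outcomes[rng.choice(len(outcomes), p=probs)]

Of course, the point of the hardness results below is that no efficient classical procedure should be able to reproduce this distribution at large n.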
Theorem (A.-Arkhipov 2011): Suppose BosonSampling is solvable in classical polynomial time. Then P^#P = BPP^NP

Upshot: Compared to (say) Shor's factoring algorithm, we get different/stronger evidence that a weaker system can do something classically hard

Better Theorem: Suppose we can sample D_A even approximately in classical polynomial time. Then in BPP^NP, it's possible to estimate Per(X), with high probability over a Gaussian random matrix $X \sim \mathcal{N}(0,1)_{\mathbb{C}}^{n \times n}$
We conjecture that the above problem is already #P-complete. If it is, then a fast classical algorithm for approximate BosonSampling would already have the consequence that P^#P = BPP^NP
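A small empirical companion to that conjecture (an exploratory sketch added here, reusing the permanent helper above): sample i.i.d. complex-Gaussian matrices and look at the empirical distribution of |Per(X)|; understanding this distribution is one of the challenges listed below.

    import numpy as np

    def gaussian_permanent_samples(n=5, trials=1000, rng=np.random.default_rng(0)):
        """Monte Carlo samples of |Per(X)| for X with i.i.d. standard complex Gaussian entries."""
        samples = []
        for _ in range(trials):
            X = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
            samples.append(abs(permanent(X)))
        return np.array(samples)

    vals = gaussian_permanent_samples()
    print(vals.mean(), np.median(vals))   # crude summary statistics of the distribution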
Challenges
Prove #P-completeness for natural average-case
approximation problems (like permanents of Gaussians)—
thereby yielding hardness for approximate BosonSampling
As a first step, understand the distribution of Per(X), X Gaussian
Early experimental implementations have been done (Rome,
Brisbane, Vienna, Oxford)! But so far with just 3-4 photons.
For scaling, will be crucial to understand the complexity of
BosonSampling when a constant fraction of photons are lost
Can the BosonSampling model solve hard “conventional”
problems? How do we verify that a BosonSampling device is
working correctly? [A.-Arkhipov 2014, A.-Nguyen 2014]
BosonSampling with thermal states: fast classical algorithm to
approximate Per(X) when X is positive semidefinite?
2. Computational Complexity and
Black Holes
Most striking application so far of complexity
to fundamental physics?
Hawking 1970s: Black holes radiate
The radiation seems thermal (uncorrelated with whatever
fell in)—but if quantum mechanics is true, then it can’t be
Susskind et al. 1990s: “Black-hole complementarity.” In
string theory / quantum gravity, the Hawking radiation
should just be a scrambled re-encoding of the same
quantum states that are also inside the black hole
The Firewall Paradox [Almheiri et al. 2012]
If the black hole interior is “built”
out of the same qubits coming out as
Hawking radiation, then why can’t
we do something to those Hawking
qubits (after waiting ~10^70 years for
enough to come out), then dive into
the black hole, and see that we’ve
completely destroyed the spacetime
geometry in the interior?
[Cartoon caption: "Entanglement among Hawking photons detected!"]
Harlow-Hayden 2013: Sure, there’s some unitary
transformation that Alice could apply to the Hawking
radiation, that would generate a “firewall” inside the event
horizon. But how long would it take her to apply it?
They showed: A natural formalization of Alice’s decoding
task is QSZK-hard
(QSZK = Quantum Statistical Zero Knowledge)
My 2002 collision lower bound suggests that QSZKBQP.
In that case, decoding would presumably take time
exponential in the number of qubits of the black hole—so
the black hole would’ve evaporated before Alice had even
made a dent!
R = Faraway Hawking Radiation
B = Just-Emitted Hawking Radiation
H = Interior of "Old" Black Hole
The HH Decoding Problem
Given a description of a quantum circuit C, such that

\[
C|0^n\rangle = |\psi\rangle_{RBH}
\]
Promised that, by acting only on R (the "Hawking radiation part"), it's possible to distill an EPR pair

\[
\frac{|0\rangle|0\rangle + |1\rangle|1\rangle}{\sqrt{2}}
\]

between R and B
Problem: Distill such an EPR pair, by applying a unitary
transformation UR to the qubits in R
My strengthening: Harlow-Hayden decoding is as hard as inverting an arbitrary one-way function

\[
|\psi\rangle_{RBH} = \frac{1}{\sqrt{2^{2n+1}}} \sum_{x, s \in \{0,1\}^n,\ a \in \{0,1\}} |f(x,s), a\rangle_R \, |x \cdot s \oplus a\rangle_B \, |x, s\rangle_H
\]
R: “old” Hawking photons / B: photons just coming out / H: still in black hole
B is maximally entangled with the last qubit of R. But in order to see that B and R are even classically correlated, one would need to learn x·s (a "hardcore bit" of f), and therefore invert f
With realistic dynamics, the decoding task seems like it should only
be “harder” than in this model case (though open how to formalize
that)
Is computational intractability the only
“armor” protecting the geometry of
spacetime inside the black hole?
Quantum Circuit Complexity and Wormholes
[A.-Susskind, in progress]
The AdS/CFT correspondence relates anti-de Sitter quantum gravity in D spacetime dimensions to conformal field theories (without gravity) in D-1 dimensions
But the mapping is extremely nonlocal!

It was recently found that an expanding wormhole, on the AdS side, maps to a collection of qubits on the CFT side that just seems to get more and more "complex":

\[
|\psi_t\rangle = U^t \left( \frac{|00\rangle + |11\rangle}{\sqrt{2}} \right)^{\otimes n}
\]

Susskind's Conjecture: The quantum circuit complexity of a CFT state can encode information about the geometry of the dual AdS. Not clear if it's true, but it has survived some nontrivial tests
Theorem: Suppose U implements (say) a computationally-universal cellular automaton. Then after t = exp(n) iterations, |ψ_t⟩ has superpolynomial quantum circuit complexity unless PSPACE ⊆ PP/poly
3. Physical Universality
Four Related Questions
For every n-qubit unitary U, is there a Boolean function f such that U can be implemented in BQP^f?
Which n-qubit unitaries could we efficiently implement if
P=PSPACE?
Can every n-qubit unitary be implemented by a quantum
circuit with poly(n) depth (but maybe exp(n) ancilla
qubits)?
Could we prove—unconditionally, with today’s
technology—that exponentially many gates are needed to
implement some n-qubit unitary U?
Generalizations of the Natural Proofs barrier?
A Grand Challenge
Can we classify all possible sets of quantum gates acting
on qubits, in terms of which unitary transformations they
approximately generate?
“Quantum Computing’s Classification of Finite Simple Groups”
Warmup: Classify all the possible Hamiltonians / Lie
algebras. Even just on 1 and 2 qubits!
A.-Bouland 2014: Every nontrivial two-mode beamsplitter
is universal
Baby case that already took lots of representation theory…
The Classical Case
A.-Grier-Schaeffer 2015: Classified all sets of reversible gates in terms of which reversible transformations F: {0,1}^n → {0,1}^n they generate (assuming swaps and ancilla bits are free)
[Diagram: the lattice of reversible-gate classes, with generating gates such as CNOT, Toffoli, and Fredkin]
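One flavor of what such a classification tracks (an illustrative Python check added here, not from the talk): different gates preserve different invariants. For example, the Fredkin (controlled-SWAP) gate conserves Hamming weight, whereas CNOT does not, so Hamming weight is one invariant that separates gate classes.

    from itertools import product

    def fredkin(c, a, b):
        """Controlled-SWAP: swap a and b iff the control bit c is 1."""
        return (c, b, a) if c == 1 else (c, a, b)

    def cnot(a, b):
        """CNOT: (a, b) -> (a, a XOR b), a linear map over GF(2)."""
        return (a, a ^ b)

    # Fredkin conserves Hamming weight on every 3-bit input...
    assert all(sum(fredkin(*bits)) == sum(bits) for bits in product((0, 1), repeat=3))
    # ...while CNOT does not: (1, 0) -> (1, 1) changes the weight
    assert sum(cnot(1, 0)) != sum((1, 0))
    print("invariant checks passed")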
Cellular Automata Beyond Turing-Universality
Schaeffer 2014: The
first known
“physically-universal”
cellular automaton
(able to implement any
transformation in any
bounded region, by
suitably initializing the
complement of that
region)
The Coffee Automaton
A., Carroll, Mohan, Ouellette, Werness 2015: Detailed
study of the rise and fall of “complex organization,” in a
reversible cellular automaton that models the
thermodynamic mixing of cream into coffee
We prove that, under coarse-graining, the Kolmogorov complexity of this image has a rising-falling pattern

[Plot: compressed file size (a proxy for Kolmogorov complexity) vs. time steps, rising and then falling over roughly 0-900 steps]
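The measurement pipeline behind such a plot can be sketched in a few lines of Python (an illustrative stand-in added here; the automaton's actual update rule and parameters are those of the paper, and compressed size is only an upper-bound proxy for Kolmogorov complexity): coarse-grain the cream/coffee grid into blocks, then record the compressed size of the coarse-grained image at each time step.

    import gzip
    import numpy as np

    def coarse_grain(grid, block=8):
        """Average over block x block cells, then quantize to a small alphabet."""
        h, w = grid.shape
        trimmed = grid[:h - h % block, :w - w % block]
        means = trimmed.reshape(h // block, block, w // block, block).mean(axis=(1, 3))
        return np.digitize(means, [0.33, 0.66])          # 3 coarse levels

    def complexity_estimate(grid, block=8):
        """Compressed size of the coarse-grained grid, as a complexity proxy."""
        coarse = coarse_grain(grid, block).astype(np.uint8)
        return len(gzip.compress(coarse.tobytes()))

Tracking complexity_estimate(grid_at_time_t) over the automaton's evolution is what would produce a rise-and-fall curve like the one described above.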
Summary
Quantum computing established a remarkable intellectual
bridge between computer science and physics
That’s always been why I’ve cared! Actual devices would be a bonus
My research agenda: to see just how much weight this
bridge can carry
Rebuilding physics in the language of computation won’t
be nearly as easy as Stephen Wolfram thought! Not only
does it require engaging our actual understanding of
physics (QM, QFT, AdS/CFT…); it requires hard
mathematical work, often making new demands on
theoretical computer science
But sure, I think it’s ultimately possible