
Preconditioning the Integral Formulation of
the Helmholtz Equation via Deflation
Department of Computational and Applied Mathematics
Rice University
Josef Sifuentes
Master of Arts
Thesis Defense
Advisors
Dr. Mark Embree
Dr. McKay Hyde
Outline
I will present the scattering phenomena we are trying to simulate
Wave Equation → Helmholtz Equation → Lippmann-Schwinger Equation
Describe the advantages and challenges to taking an integral approach
Describe spectral properties of the linear system as a function of the
wave parameter
Describe GMRES performance as a function of the spectrum of the
coefficient matrix
I will present a preconditioner based on eigenvalue deflation
We can use coarse mesh calculations to build a right preconditioner
The preconditioner is based on the geometry of the scattering obstacle and
the wave parameter
Introduction - Modeling Wave Scattering
We address the problem of scattering of electromagnetic waves that
are time harmonic with a given frequency
satisfy outgoing boundary conditions
are scattered by a compactly supported obstacle in 2-D
Electromagnetic and acoustic waves are governed by similar physical laws, so this problem is relevant to many applications
Applications
Wave scattering applications are crucial to a large
spectrum of disciplines in science and engineering
Laser-Driven Fusion (laser-plasma interactions)
Radar, Sonar, and Remote
Sensing
Medical and Biological
Imaging
Materials Science
Communications
Exploration Seismology
The Wave Equation
The phenomenon of time-harmonic waves scattered by a two-dimensional obstacle is described by the wave equation
(n(x)²/c²) ∂²v/∂t² = Δv
c is the speed of wave propagation
n(x) is the refractive index of the scattering obstacle
v(x,t) satisfies outgoing boundary conditions
The Helmholtz Equation
Since we are interested in time-harmonic waves of some given frequency ω, we can write our solution v(x,t) as
v(x,t) = u(x)e^{−iωt}
Factoring out the time component of the solution to the wave equation gives the elliptic Helmholtz equation in space
Δu(x) + κ²n(x)²u(x) = 0
κ = ω/c is the wave number
Our solution u(x) to the Helmholtz equation is the sum of the scattered
field us and the incident wave ui
We require that us be outgoing, i.e., that it satisfy the Sommerfeld radiation condition
lim_{r→∞} √r (∂us/∂r − iκus) = 0
We also require that the incident wave satisfy the Helmholtz equation outside the scattering obstacle
An Integral Equation Formulation
An equivalent formulation is given by the Lippmann-Schwinger equation
u(x) = ui(x) − κ² ∫_Ω g(x − y) m(y) u(y) dy
g(x) = (i/4)H₀⁽¹⁾(κ|x|) is the free-space Green's function for R²
m(x) = 1 − n(x)² is compactly supported on the scattering domain Ω
This can be rearranged into a Fredholm integral equation of the second kind for u(x):
(I + κ²Km)u = ui
An Integral Equation Formulation
Discretizing the integral equation using a composite trapezoid method gives the equations
uk + κ²h² Σj g(xk − xj) m(xj) uj = ui(xk),  k = 1, …, N
N is the number of mesh nodes
h² is the area of each partition
uk is the approximate solution evaluated at xk ∈ Ω
We can rewrite these equations as the linear system
(I + κ²K)u = ui
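As a concrete illustration, the assembly of this linear system can be sketched in a few lines of Python. This is an assumption-laden sketch, not the thesis code: the singular diagonal entry g(0) is simply set to zero (a crude regularization; a careful trapezoid scheme treats the singularity more accurately), the obstacle is the unit square with m = −1, and the incident field is a plane wave.

```python
import numpy as np
from scipy.special import hankel1

# Sketch of assembling the trapezoid discretization (I + kappa^2 K) u = ui
# on the unit square; the singular diagonal entry g(0) is zeroed here,
# a crude stand-in for a proper treatment of the singularity.
kappa = 5.0
n = 20                                   # nodes per direction
h = 1.0 / n
x, y = np.meshgrid((np.arange(n) + 0.5) * h, (np.arange(n) + 0.5) * h)
pts = np.column_stack([x.ravel(), y.ravel()])
N = n * n
m = -np.ones(N)                          # m(x) = 1 - n(x)^2 = -1 inside the square

r = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
G = np.zeros((N, N), dtype=complex)
mask = r > 0
G[mask] = 0.25j * hankel1(0, kappa * r[mask])   # free-space Green's function

A = np.eye(N) + kappa**2 * h**2 * G * m[None, :]
ui = np.exp(1j * kappa * pts[:, 0])      # incident plane wave
u = np.linalg.solve(A, ui)               # direct solve; GMRES used in practice
print(np.linalg.norm(A @ u - ui))        # small residual
```

In practice N is far too large for a dense solve, which motivates the FFT-accelerated matrix-vector product and the iterative solver discussed next.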
Advantages
Only the scattering obstacle needs to be discretized
A solution to the integral equation automatically satisfies the outgoing
radiation conditions
Convergence is governed by the wave number κ, with only a mild dependence on the mesh size (provided the mesh is sufficiently resolved for the wave number κ).
The matrix K is structured so that a Fast Fourier Transform (FFT) method can be used to accelerate the matrix-vector product to O(N log N) operations, as opposed to O(N²) operations for a dense system. This will be important in implementing an iterative solver.
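The FFT acceleration can be sketched as follows: on a uniform grid the kernel depends only on the offset xk − xj, so applying K is a discrete convolution, which can be embedded in a circulant array and applied with two-dimensional FFTs. A minimal sketch, with a smooth stand-in kernel e^{iκr} in place of the singular Hankel kernel and the diagonal zeroed:

```python
import numpy as np

# Applying K on a uniform grid is a discrete convolution with the kernel
# g(x_k - x_j); embed it in a circulant of size (2n) x (2n) and use FFTs.
kappa, n = 5.0, 16
h = 1.0 / n
pad = 2 * n

def ker(dx, dy):
    # smooth stand-in for the singular Hankel kernel; zero on the diagonal
    r = np.hypot(dx, dy)
    return np.where(r > 0, np.exp(1j * kappa * r), 0.0)

# Kernel values for all offsets p, q in [-(n-1), n-1], wrapped so that
# circular convolution reproduces the finite sum exactly.
p = np.arange(pad)
off = np.where(p < n, p, p - pad)
Gp = ker(off[:, None] * h, off[None, :] * h)

rng = np.random.default_rng(0)
w = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))  # m*u values

Wp = np.zeros((pad, pad), dtype=complex)
Wp[:n, :n] = w
y_fft = h**2 * np.fft.ifft2(np.fft.fft2(Gp) * np.fft.fft2(Wp))[:n, :n]

# Dense check: (K w)_k = h^2 sum_j g(x_k - x_j) w_j
idx = np.arange(n)
K = ker((idx[:, None, None, None] - idx[None, None, :, None]) * h,
        (idx[None, :, None, None] - idx[None, None, None, :]) * h)
y_dense = h**2 * np.einsum('abcd,cd->ab', K, w)
print(np.allclose(y_fft, y_dense))       # True
```

The FFT route costs O(N log N) per product versus O(N²) for the dense sum, which is the key to making an iterative solver affordable.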
Challenges
K is dense and non-Hermitian of dimension N = number of mesh points.
The number of mesh points must be at least 2κ in each direction in order to resolve the frequency of the wave, i.e. N ≥ 4κ²
We will show that the number of GMRES iterations necessary to converge is on the order of O(κ²)
Although we can use an FFT method to accelerate the matrix-vector routine to O(N log N) operations, the cost of orthogonalizing the Krylov basis is O(kN) at iteration k
This gives an overall computational cost of O(κ⁶).
For even modest values of κ, this cost can become prohibitive without an effective preconditioner.
My Main Contributions
I will describe the spectral properties of the coefficient matrix as a function of the wave number κ
I will describe GMRES performance as a function of the spectral properties
By identifying the portion of the spectrum that causes GMRES to stagnate, we propose a right preconditioner based on eigenvalue deflation
The preconditioner is a function of the geometry of the scattering obstacle and the wave number κ, and is independent of the incident wave.
The preconditioning matrix can be computed using coarse mesh approximations
Spectral Properties – The Eigenvalues
Consider the square scattering obstacle Ω = [0,1] × [0,1] with a constant contrast m(x) = 1 − n(x)² = −1.
The integral operator Km is compact, and its spectrum has a limit point at 0. Similarly, the eigenvalues of the matrix K cluster at the origin.
The eigenvalues of the coefficient matrix A = I + κ²K therefore have a cluster at 1.
The eigenvalues lie approximately on a circle in the complex plane tangent to the real line at 1.
The radius of the circle grows linearly with κ.
 = 10
 = 20
 = 30
 = 40
N = 400
N = 1600
N = 3600
N = 6400
Spectral Properties – The Eigenvalues
Two features of the spectrum cause GMRES to stagnate:
A subset of the spectrum moves closer to the origin as κ increases
The number of well-separated eigenvalues along the perimeter of the spectral distribution grows like O(κ²)
This growth in the number of well-separated eigenvalues foreshadows the growth in iterations required for GMRES convergence
Any deflation-based preconditioner must address this subset of the spectrum to be successful
Pseudospectra
[Figure: pseudospectra of A for κ = 10 and κ = 20.]
Spectral Properties – The Eigenvectors
The eigenvectors of A are discrete approximations to some of the eigenfunctions of
the integral equation on the computational grid.
We observe there is a natural ordering of frequency of the eigenvectors in relation
to the distribution of the corresponding eigenvalues.
The eigenfunctions that correspond to the well-separated eigenvalues, apart from the cluster at one, are low frequency waves, with the lowest frequency corresponding to the eigenvalue closest to 2 on the real line, and the frequency increasing as the corresponding eigenvalue moves clockwise.
Spectral Properties – The Eigenvectors
This frequency ordering is suggestive: coarse mesh calculations should give reasonable approximations of the low frequency subset of the spectrum.
We will show that it is exactly this subset, corresponding to low frequency eigenfunctions, that causes GMRES to stagnate.
We can therefore deflate this eigenspace by interpolating coarse mesh calculations onto a mesh that is sufficiently resolved for the wave number κ.
This suggests a two-grid or multigrid approach to preconditioning this linear system could be successful.
GMRES Performance as a Function of the
Spectrum
Our goal is to solve
Au = ui
where
A = I + κ²K
GMRES is an iterative solver that at iteration k minimizes the 2-norm of the residual rk = ui − Auk over the k-dimensional Krylov subspace
Kk(A, ui) = span{ui, Aui, …, A^{k−1}ui}
The basis vectors of the Krylov subspace are products of increasing powers of A and the right hand side vector ui.
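This minimization can be illustrated directly: build the Krylov basis by repeated multiplication by A and solve a least-squares problem over that subspace. A toy sketch on a small random matrix (not the Helmholtz system); real GMRES uses the Arnoldi process rather than the ill-conditioned power basis used here:

```python
import numpy as np

# GMRES at step k minimizes ||ui - A u|| over span{ui, A ui, ..., A^{k-1} ui};
# here we do that minimization naively with a least-squares solve.
rng = np.random.default_rng(1)
N = 50
A = np.eye(N) + 0.1 * rng.standard_normal((N, N))   # eigenvalues cluster near 1
ui = rng.standard_normal(N)

res, cols = [], []
v = ui.copy()
for k in range(10):
    cols.append(v)                        # Krylov basis [ui, A ui, A^2 ui, ...]
    v = A @ v
    Kk = np.column_stack(cols)
    y, *_ = np.linalg.lstsq(A @ Kk, ui, rcond=None)
    res.append(np.linalg.norm(ui - A @ (Kk @ y)))

print(res)                                # nonincreasing residual norms
```

Because the subspaces are nested, the residual norms are nonincreasing in k, which is exactly the behavior the convergence analysis below quantifies.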
GMRES Performance as a Function of the
Spectrum
The vector inside the norm is the product of a polynomial in A and the right hand side vector ui:
‖rk‖ = min over pk ∈ Pk of ‖pk(A)ui‖
where Pk is the set of polynomials pk of degree at most k with pk(0) = 1.
Numerical experiments show A to be nondefective, i.e., the matrix V consisting of eigenvectors of A as columns is full rank. Then we can diagonalize the matrix A:
AV = VΛ
GMRES Performance as a Function of the
Spectrum
Where (A) is the set of eigenvalues of A
(V) is the condition number of V, (V) = ||V|| ||V-1||
GMRES Performance – Numerical Results
Residual history for κ = 10, 15, …, 40
Iterations to converge for κ = 10, 15, …, 40
The GMRES iteration count scales quadratically with κ
The residual history exhibits two stages of convergence:
A period of stagnation corresponding to the deflation of the set of well-separated eigenvalues
A period of rapid convergence corresponding to the deflation of the eigenvalues clustered around 1.
GMRES Performance – The Movie
An Idealized Preconditioner
We verify the description of GMRES performance given above and develop an idealized preconditioner via oblique projections.
Recall that V is full rank, and therefore ui can be expanded in the eigenvector basis.
Let c be the vector such that
ui = Vc
where V = [v1 v2 … vN], and the eigenvectors are ordered in terms of frequency, that is, in clockwise order of the corresponding eigenvalues.
An Idealized Preconditioner
Define V1 = [v1 v2 … vr], V2 = [vr+1 vr+2 … vN] for r ≪ N
Then we can write ui in terms of the first r eigenvectors and the last N − r eigenvectors:
ui = V1c1 + V2c2
We remove the eigenvalues that cause GMRES to stagnate by obliquely projecting ui onto the span of the eigenvectors corresponding to the cluster at 1.
We apply GMRES to the projected system
Aû = V2c2
An Idealized Preconditioner
At iteration k, GMRES now minimizes the residual of the projected system Aû = V2c2 over the Krylov subspace Kk(A, V2c2).
We recover the solution as
u = û + V1Λ1⁻¹c1
where Λ1 = diag(λ1, …, λr), since A(V1Λ1⁻¹c1) = V1c1.
Since the GMRES polynomial is required to be small only on the remaining N − r eigenvalues, which are clustered around 1 for sufficiently large r, we should expect GMRES to converge rapidly.
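The projection and recovery steps can be sketched in a few lines. This toy uses a small random diagonalizable matrix and orders eigenpairs by distance from 1 as a stand-in for the frequency ordering; the inner solve uses a direct method where GMRES would be used:

```python
import numpy as np

# Idealized preconditioner sketch: split the eigenvector basis, solve only
# for the V2 component of the right-hand side, and recover the deflated
# component exactly via A (V1 L1^{-1} c1) = V1 c1.
rng = np.random.default_rng(2)
N, r = 40, 5
A = np.eye(N) + 0.5 * rng.standard_normal((N, N))
lam, V = np.linalg.eig(A)

order = np.argsort(-np.abs(lam - 1))     # "outliers" first (stand-in ordering)
lam, V = lam[order], V[:, order]
V1, V2 = V[:, :r], V[:, r:]

ui = rng.standard_normal(N) + 0j
c = np.linalg.solve(V, ui)               # ui = V1 c1 + V2 c2
c1, c2 = c[:r], c[r:]

u2 = np.linalg.solve(A, V2 @ c2)         # GMRES would act on this system
u = u2 + V1 @ (c1 / lam[:r])             # recover the full solution
print(np.linalg.norm(A @ u - ui))        # ~ 0
```

The recovery step is exact because each deflated eigenvector satisfies A v = λ v, so its contribution to the solution is known in closed form.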
An Idealized Preconditioner – The Movie
Numerical Results
As a larger eigenspace is projected out of the right hand side, the number of iterations necessary for convergence decreases.
We see that not until some critical mass of eigenvalues is deflated do we see a payoff in the number of iterations required for convergence.
This confirms our description of GMRES stagnation as caused by the set of well-separated outlying eigenvalues, together with the subset of these that moves closer to the origin.
Eigenvalue Deflation
Our goal is to design a preconditioner that is
independent of the incident wave ui
requires only the first r eigenvectors
Eigenvalue deflation based preconditioners have often been used to
accelerate restarted GMRES convergence
The restarted GMRES algorithm restarts the iteration process after a set
number of iterations and uses the last iterate as the initial guess of the new
cycle.
Though this saves storage costs, it discards the information about the spectrum that GMRES has accumulated.
Therefore, spectral information obtained through the Arnoldi iteration is often used to augment the Krylov subspace in subsequent cycles, in the form of a preconditioner, in the hope of mimicking the performance of full GMRES.
Right Preconditioner via Deflation
One such approach is given by Erhel, Burrage, and Pohl.
Theorem
Let P be an invariant subspace of dimension r corresponding to the first r eigenvalues of A.
Let the columns of the matrix U form an orthonormal basis of P.
Let T = U*AU and M = In + U(|λn|⁻¹T − Ir)U*. Then M is nonsingular,
M⁻¹ = In + U(|λn|T⁻¹ − Ir)U*,
and the eigenvalues of AM⁻¹ are λr+1, λr+2, …, λn, and |λn| (the last with multiplicity at least r).
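A minimal numerical check of this theorem, on a small symmetric test matrix (so the invariant subspace is spanned by eigenvectors) rather than the Helmholtz system:

```python
import numpy as np

# Deflation preconditioner check: U spans the invariant subspace of the
# r smallest eigenvalues; A M^{-1} should have eigenvalues
# lam_{r+1}, ..., lam_n plus |lam_n| with multiplicity r.
rng = np.random.default_rng(3)
n, r = 12, 3
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
lam = np.arange(1.0, n + 1.0)            # eigenvalues 1, 2, ..., n
A = Q @ np.diag(lam) @ Q.T

U = Q[:, :r]                             # orthonormal basis of the invariant subspace
T = U.T @ A @ U
lam_n = lam[-1]
Minv = np.eye(n) + U @ (lam_n * np.linalg.inv(T) - np.eye(r)) @ U.T

ev = np.sort(np.linalg.eigvals(A @ Minv).real)
expected = np.sort(np.concatenate([lam[r:], [lam_n] * r]))
print(np.allclose(ev, expected))         # True
```

Note that M⁻¹ is a rank-r update of the identity, so applying it costs only O(rN) per iteration once U and T⁻¹ are stored.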
Numerical Results
 = 10
We test this approach by computing V1 on a fine mesh and defining
P = R(V1),
U = orth(P).
Then we use GMRES to solve
AM-1 (Mu) = ui.
Numerical Results
 = 20
The performance of this preconditioner is similar to that of the oblique
projections of our idealized preconditioner, as would expect
As the dimension r of the deflated eigenspace increases, the number
of GMRES iterations decrease until a plateau corresponding to GMRES
having to deflate only the cluster around 1.
Numerical Results
 = 30
An unexpected and thus far unexplained result is that this
preconditioner shows immediate results for low values of r.
In the case of the oblique projections, we had to reach some critical
mass of the dimension size of the deflated eigenspace before any
significant results are shown.
Using Approximate Eigenspaces
A feature of this right preconditioner is that it can be computed once and stored for a given scattering obstacle and wave number κ.
However, we can reduce this cost further by computing the eigenspace we wish to deflate from coarse mesh approximations.
Recall that the eigenspace we wish to deflate corresponds to low frequency eigenfunctions.
We compute an r-dimensional eigenspace by interpolating the coarse mesh eigenvectors using bilinear, bicubic, and cubic spline interpolants.
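The interpolation step can be sketched with SciPy's RegularGridInterpolator; here a smooth sine mode stands in for a coarse-mesh eigenvector, and bilinear interpolation (method="linear") lifts it to the fine mesh:

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Lift a coarse-mesh "eigenvector" to a fine mesh by bilinear interpolation.
# The low-frequency mode sin(pi x) sin(pi y) stands in for a true eigenvector.
nc, nf = 10, 40
xc = np.linspace(0.0, 1.0, nc)
xf = np.linspace(0.0, 1.0, nf)

Xc, Yc = np.meshgrid(xc, xc, indexing="ij")
coarse = np.sin(np.pi * Xc) * np.sin(np.pi * Yc)   # coarse-grid samples

interp = RegularGridInterpolator((xc, xc), coarse, method="linear")
Xf, Yf = np.meshgrid(xf, xf, indexing="ij")
fine = interp(np.column_stack([Xf.ravel(), Yf.ravel()])).reshape(nf, nf)

exact = np.sin(np.pi * Xf) * np.sin(np.pi * Yf)
print(np.max(np.abs(fine - exact)))      # O(h^2) interpolation error
```

Because the deflated eigenfunctions are low frequency, a coarse grid already resolves them, and the O(h²) interpolation error of the bilinear lift is small relative to the mode itself.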
Numerical Results for Interpolation based
Preconditioner
For wave number κ = 10
dimension of the deflated eigenspace r = 40
n = 5, 10, …, 2κ, where n is the number of nodes in each direction of the coarse mesh, i.e. N = n²
Numerical Results for Interpolation
based Preconditioner
For wave number κ = 20
dimension of the deflated eigenspace r = 160
n = 5, 10, …, 2κ, where n is the number of nodes in each direction of the coarse mesh, i.e. N = n²
Numerical Results for Interpolation based
Preconditioner
For wave number κ = 30
dimension of the deflated eigenspace r = 360
n = 5, 10, …, 2κ, where n is the number of nodes in each direction of the coarse mesh, i.e. N = n²
Dependence on Order of Interpolation
= 10
 = 20
 = 30
r = 40
r = 160
r = 360
n = 10
n = 15
n = 25
Conclusion
We identify the portion of the spectrum that causes GMRES to stagnate
We show that this subset corresponds to low frequency eigenfunctions
We exploit this by building a preconditioner based on deflating an approximate eigenspace calculated from coarse mesh approximations
The preconditioner can be stored as a rank r update to the identity matrix for a given geometry and wave number κ
Then for simulations involving a given scattering obstacle and various incident waves we can store and reuse the preconditioning matrix
Future Work
Application of this preconditioner to more complex geometries and refractive index functions.
Implementing this preconditioner for high order integral equation methods, which involve a change of variables to resolve the singularity in the Green's function.
We essentially use a two-grid method to precondition this equation. This may provide a framework for other two-grid or multigrid methods.
Acknowledgments
Dr. Mark Embree
Dr. McKay Hyde
Dr. Tim Warburton
Dr. Dan Sorensen
John Sabino (this is his laptop)