<p><em>Pages tagged <a href="http://lovesgoodfood.com/jason/tags/linear_algebra/">linear algebra</a> on Jason’s Chatter (ikiwiki); last updated 2011-04-19.</em></p>
<h2><a href="http://lovesgoodfood.com/jason/posts/Just_to_put_a_stake_in_the_ground/">Just to put a stake in the ground</a> <small>(2011-04-19)</small></h2>
<p>I keep forgetting to write this up, but it’s related to static pivoting for symmetric indefinite problems. Existing work forms a weighted matching on the log of a matrix’s elements as in <a href="http://etna.mcs.kent.edu/vol.23.2006/pp158-179.dir/pp158-179.html">Schenk and Gärtner, 2006</a> and <a href="http://en.scientificcommons.org/980523">Duff and Pralet, 2005</a>.</p>
<p>Both report limited success, but they’re using the wrong weights and making the problem far too complicated.</p>
<p>Don’t use the matrix’s elements, use the potential <strong>pivots</strong>. In the symmetric indefinite case, that means replacing each off-diagonal element by the determinant of the corresponding 2×2 submatrix. Element <em>A(i, j)</em> with <em>i ≠ j</em> is replaced by abs (det ([<em>A(i,i)</em>, <em>A(i,j)</em>; <em>A(j,i)</em>, <em>A(j,j)</em>])). Element <em>A(i,i)</em> is replaced by abs (<em>A(i,i)</em>) and is treated as a self-loop. If a maximum-weight matching pairs <em>i</em> with itself, use a 1×1 pivot. Otherwise use the corresponding 2×2 pivot. No need to find cycles, break loops, <em>etc.</em></p>
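<p>A minimal numpy sketch of the proposed weighting — <code>pivot_weights</code> is a hypothetical helper name, and the maximum-weight matching itself would come from a separate code that permits self-loops:</p>

```python
import numpy as np

def pivot_weights(A):
    """Weights for a maximum-weight matching on potential pivots:
    the diagonal weight W[i, i] = |A(i, i)| is a self-loop (a 1x1
    pivot); the off-diagonal weight W[i, j] is the absolute
    determinant of the 2x2 pivot candidate, |A(i,i)*A(j,j) -
    A(i,j)*A(j,i)|."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        W[i, i] = abs(A[i, i])
        for j in range(i + 1, n):
            w = abs(A[i, i] * A[j, j] - A[i, j] * A[j, i])
            W[i, j] = W[j, i] = w
    return W
```

<p>For a matrix with a zero diagonal like [[0, 2], [2, 0]], the self-loop weights vanish while the 2×2 weight is 4, so the matching would (correctly) prefer the 2×2 pivot.</p>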
<p>I haven’t tested the idea thoroughly, but I can’t see how it could possibly be worse than the more contrived contraptions. This feels like the natural analog, and it also maps perturbations of small pivots directly back to original entries.</p>
<h2><a href="http://lovesgoodfood.com/jason/posts/GMRES_v_perturbations/">GMRES v. perturbations</a> <small>(2011-04-19)</small></h2>
<p>Once upon a time, we considered using simple
<a href="http://www.netlib.org/linalg/html_templates/node29.html" title="GMRES
from the Templates book">GMRES</a> to fix an \( LU \) factorization perturbed along
the diagonal. More specifically, you sometimes need to perturb tiny
pivots when you’ve
<a href="http://www.eecs.berkeley.edu/Pubs/TechRpts/2010/EECS-2010-172.html">chosen them statically</a>
before factorization.</p>
<p>The previous
<a href="http://crd.lbl.gov/~xiaoye/SC98/sc98.htm">SuperLU perturbation heuristics</a>
sometimes produced perturbations too large for iterative refinement to
fix. Applying a few steps of non-preconditioned GMRES appeared to
work, although it’s more expensive.</p>
<p>However, non-preconditioned GMRES cannot fix an arbitrary perturbation
in few steps. Consider an \(N\times N\) unit-entry, lower-bidiagonal matrix
\begin{equation*}
A = \left[\begin{array}{cccccc}
1 & 0 & 0 & & & 0 \newline
1 & 1 & 0 & \cdots & & \newline
0 & 1 & 1 & & & \newline
& \vdots & & \ddots& & \newline
& & & & 1 & 0 \newline
0 & & & & 1 & 1
\end{array}\right],
\end{equation*}
and a system \(A x = b\) with \(b = [1; -1; 1; -1; \ldots]\). The
true solution alternates in sign: \(x = [1; -2; 3; -4; \ldots]\).</p>
<p>Perturb the first diagonal entry of \(A\) to obtain \(\tilde{A}\):
<em>every</em> component of the computed solution \(y = \tilde{A}^{-1} b\)
differs from \(x\), but the residual \(r\) is non-zero only in its
first entry. (You can play with the entries to render all
calculations exact; I just don’t have the worked exact example handy.)</p>
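<p>A quick numpy check of this example — the perturbed forward solve changes every component, yet the residual lives entirely in the first entry:</p>

```python
import numpy as np

N = 6
# Unit lower-bidiagonal A: ones on the diagonal and first subdiagonal.
A = np.eye(N) + np.diag(np.ones(N - 1), k=-1)
b = np.array([(-1.0) ** i for i in range(N)])            # [1, -1, 1, -1, ...]
x = np.array([(-1.0) ** i * (i + 1) for i in range(N)])  # [1, -2, 3, -4, ...]
assert np.allclose(A @ x, b)        # the alternating-sign solution is exact

# Perturb the first diagonal entry and re-solve.
At = A.copy()
At[0, 0] += 0.5
y = np.linalg.solve(At, b)

print(np.all(y != x))                     # every component of y differs
r = b - A @ y
print(np.nonzero(np.abs(r) > 1e-12)[0])   # residual support: only entry 0
```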
<p>The first iteration of GMRES searches the space
\(\operatorname{span} \{ r \}\) for an improvement to the computed \(y\).
That alters only the first component of \(y\). The second iteration
searches \(\operatorname{span} \{ r, A r \}\). Because \(A\) is upper
Hessenberg, this space only affects the first two components of \(y\).
In general, the \(i\)th iteration of GMRES only affects the first \(i\)
components of the computed solution \(y\).</p>
<p>The computed \(y\) differs from the true solution \(x\) in every
component, so plain GMRES requires \(N\) iterations to fix a single
perturbation.</p>
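<p>The support argument is easy to verify numerically: with \(r\) supported on the first entry, each application of the upper-Hessenberg \(A\) extends the support of the Krylov vector by at most one component.</p>

```python
import numpy as np

N = 8
A = np.eye(N) + np.diag(np.ones(N - 1), k=-1)  # unit lower-bidiagonal
r = np.zeros(N)
r[0] = 1.0                                     # residual supported on entry 1

v = r.copy()
for k in range(N):
    support = np.nonzero(v)[0]
    # A^k r touches only the first k+1 components, so the k-step Krylov
    # space span{r, A r, ..., A^{k-1} r} corrects only the first k entries.
    assert support.max() <= k
    v = A @ v
```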
<p>If you use the perturbed factorization as a preconditioner, however,
the behavior is more difficult to analyze. I don’t have a clean
example of failing behavior handy, nor have I proven that it works for
sufficiently small perturbations. Definitely something to consider
further.
<a href="http://www.tau.ac.il/~stoledo/Bib/Pubs/AvronNgToledo.pdf">Avron, Ng, and Toledo (2009)</a>
show that preconditioning with \(R\) in a perturbed \(QR\) works for
least-squares problems.</p>
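<p>This is no proof for general perturbations, but for the bidiagonal example above a short numpy sketch shows why the preconditioned iteration should be fast: a single perturbed pivot makes \(A\tilde{A}^{-1}\) a rank-one modification of the identity (Sherman–Morrison), so its minimal polynomial has degree at most two and GMRES on the preconditioned system finishes in at most two steps.</p>

```python
import numpy as np

N = 8
A = np.eye(N) + np.diag(np.ones(N - 1), k=-1)   # unit lower-bidiagonal
At = A.copy()
At[0, 0] += 0.5                                 # perturbed first pivot

# Right-preconditioned operator B = A * inv(At) differs from the
# identity by a rank-one matrix.
B = A @ np.linalg.inv(At)
assert np.linalg.matrix_rank(B - np.eye(N)) == 1

# Hence the minimal polynomial of B has degree <= 2; check that the
# GMRES minimization over the two-dimensional Krylov space already
# drives the residual to zero.
b = np.array([(-1.0) ** i for i in range(N)])
K = np.column_stack([b, B @ b])                 # Krylov basis {r0, B r0}
c = np.linalg.lstsq(B @ K, b, rcond=None)[0]    # min ||b - B K c||
resid = np.linalg.norm(b - B @ (K @ c))
print(resid < 1e-10)                            # True
```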