pages tagged linear algebra Jason's Chatter http://lovesgoodfood.com/jason/tags/linear_algebra/ Jason's Chatter ikiwiki 2011-04-19T18:44:59Z Just to put a stake in the ground http://lovesgoodfood.com/jason/posts/Just_to_put_a_stake_in_the_ground/ 2011-04-19T18:44:59Z 2011-04-19T18:44:59Z <p>I keep forgetting to write this up, but it&rsquo;s related to static pivoting for symmetric indefinite problems. Existing work forms a weighted matching on the log of a matrix&rsquo;s elements, as in <a href="http://etna.mcs.kent.edu/vol.23.2006/pp158-179.dir/pp158-179.html">Schenk and G&auml;rtner, 2006</a> and <a href="http://en.scientificcommons.org/980523">Duff and Pralet, 2005</a>.</p> <p>Both report limited success, but they&rsquo;re using the wrong weights and making the problem far too complicated.</p> <p>Don&rsquo;t use the matrix&rsquo;s elements; use the potential <strong>pivots</strong>. In the symmetric indefinite case, that means replacing each off-diagonal element by the 2x2 determinant of the respective submatrix: element <em>A(i, j)</em> with <em>i &ne; j</em> is replaced by abs (det ([<em>A(i,i)</em>, <em>A(i,j)</em>; <em>A(j,i)</em>, <em>A(j,j)</em>])). Element <em>A(i,i)</em> is replaced by abs (<em>A(i,i)</em>) and is treated as a self-loop. If a maximum-weight matching pairs <em>i</em> with itself, use a 1x1 pivot; otherwise use the corresponding 2x2 pivot. There is no need to find cycles, break loops, <i>etc.</i></p> <p>I haven&rsquo;t tested the idea thoroughly, but I can&rsquo;t see how it could possibly be worse than the more contrived contraptions. This feels like the natural analogue, and it also could map perturbations of small pivots directly to original entries.</p> GMRES v. 
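<p>A minimal numpy sketch of the weighting described above. The greedy pass below is only a stand-in for a true maximum-weight matching (which a real implementation would use), and the example matrix is invented for illustration:</p>

```python
import numpy as np

def pivot_weights(A):
    # Weight for pairing i with j (i != j): the absolute 2x2 pivot
    # determinant |A(i,i)*A(j,j) - A(i,j)*A(j,i)|.
    # Weight for pairing i with itself: |A(i,i)|, treated as a self-loop.
    d = np.diag(A)
    W = np.abs(np.outer(d, d) - A * A.T)
    np.fill_diagonal(W, np.abs(d))
    return W

def greedy_pivot_matching(W):
    # Greedy stand-in for a maximum-weight matching: repeatedly take the
    # heaviest remaining (self-)edge whose endpoints are both unmatched.
    n = W.shape[0]
    edges = sorted(((W[i, j], i, j) for i in range(n) for j in range(i, n)),
                   reverse=True)
    matched, pivots = set(), []
    for w, i, j in edges:
        if i in matched or j in matched:
            continue
        matched.update((i, j))
        # A self-loop (i paired with itself) means a 1x1 pivot.
        pivots.append((i,) if i == j else (i, j))
    return pivots

# Symmetric indefinite example: tiny diagonals in rows 0 and 1, so the
# matching should choose the 2x2 pivot (0, 1) and a 1x1 pivot on 2.
A = np.array([[1e-8, 2.0, 0.0],
              [2.0, 1e-8, 0.0],
              [0.0, 0.0, 5.0]])
print(greedy_pivot_matching(pivot_weights(A)))   # -> [(2,), (0, 1)]
```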
perturbations http://lovesgoodfood.com/jason/posts/GMRES_v_perturbations/ 2011-04-19T18:44:59Z 2011-04-19T18:44:59Z <p>Once upon a time, we considered using simple <a href="http://www.netlib.org/linalg/html_templates/node29.html" title="GMRES from the Templates book">GMRES</a> to fix an $$LU$$ factorization perturbed along the diagonal. More specifically, you sometimes need to perturb a tiny pivot when the pivots were <a href="http://www.eecs.berkeley.edu/Pubs/TechRpts/2010/EECS-2010-172.html">chosen statically</a> before factorization.</p> <p>The previous <a href="http://crd.lbl.gov/~xiaoye/SC98/sc98.htm">SuperLU perturbation heuristics</a> sometimes produced perturbations too large for iterative refinement to fix. Applying a few steps of non-preconditioned GMRES appeared to work, although it&rsquo;s more expensive.</p> <p>However, non-preconditioned GMRES cannot fix an arbitrary perturbation in a few steps. Consider an $$N\times N$$ unit-entry, lower-bidiagonal matrix \begin{equation*} A = \left[\begin{array}{cccccc} 1 &amp; &amp; &amp; &amp; &amp; \newline 1 &amp; 1 &amp; &amp; &amp; &amp; \newline &amp; 1 &amp; 1 &amp; &amp; &amp; \newline &amp; &amp; \ddots &amp; \ddots &amp; &amp; \newline &amp; &amp; &amp; 1 &amp; 1 &amp; \newline &amp; &amp; &amp; &amp; 1 &amp; 1 \end{array}\right], \end{equation*} and a system $$A x = b$$ with $$b = [1; -1; 1; -1; \ldots]$$. The true solution alternates in sign and grows in magnitude: $$x = [1; -2; 3; -4; \ldots]$$.</p> <p>Perturb the first diagonal entry of $$A$$ to obtain $$\tilde{A}$$, and <em>every</em> component of the computed solution $$y = \tilde{A}^{-1} b$$ differs from $$x$$, but the residual $$r = b - A y$$ is non-zero only in its first entry. (You can play with the entries to render all calculations exact; I just don&rsquo;t have the worked exact example handy.)</p> <p>The first iteration of GMRES searches the space $$\operatorname{span} \{ r \}$$ for an improvement to the computed $$y$$. That alters only the first component of $$y$$. 
The second iteration searches $$\operatorname{span} \{ r, A r \}$$. Because $$A$$ is upper Hessenberg, multiplying by $$A$$ extends a vector&rsquo;s nonzero support downward by at most one position, so this space affects only the first two components of $$y$$. In general, the $$i$$th iteration of GMRES alters only the first $$i$$ components of the computed solution $$y$$.</p> <p>The computed $$y$$ differs from the true solution $$x$$ in every component, so plain GMRES requires $$N$$ iterations to fix a single perturbation.</p> <p>If you use the perturbed factorization as a preconditioner, however, the behavior is more difficult to analyze. I don&rsquo;t have a clean example of failing behavior handy, nor have I proven that it works for sufficiently small perturbations. Definitely something to consider further. <a href="http://www.tau.ac.il/~stoledo/Bib/Pubs/AvronNgToledo.pdf">Avron, Ng, and Toledo (2009)</a> show that preconditioning with $$R$$ from a perturbed $$QR$$ works for least-squares problems.</p>
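<p>The whole argument can be checked numerically. A small numpy sketch with $$N = 6$$; setting the perturbed pivot to 2 is one choice that keeps all the arithmetic exact (everything stays in halves):</p>

```python
import numpy as np

N = 6
# Unit-entry lower-bidiagonal A and the alternating right-hand side b.
A = np.eye(N) + np.diag(np.ones(N - 1), -1)
b = np.array([(-1.0) ** i for i in range(N)])   # [1, -1, 1, -1, ...]
x = np.linalg.solve(A, b)                       # [1, -2, 3, -4, ...]

# Perturb only the (0, 0) pivot; the value 2 keeps the arithmetic exact.
At = A.copy()
At[0, 0] = 2.0
y = np.linalg.solve(At, b)

r = b - A @ y
# The residual lives entirely in the first component ...
print(np.flatnonzero(r))                        # -> [0]
# ... yet y is wrong in *every* component.
print(np.flatnonzero(y - x).size)               # -> 6  (== N)

# Krylov vectors A^k r gain at most one new nonzero position per step,
# so the k-th GMRES correction touches only the first k components of y.
v = r.copy()
for k in range(N):
    assert np.flatnonzero(v).max() <= k
    v = A @ v
```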
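<p>One observation suggesting the preconditioned case is benign for a <em>single</em> perturbed pivot: with $$\tilde{A}$$ itself as the preconditioner, $$\tilde{A}^{-1} A$$ is a rank-one update of the identity, so its minimal polynomial has degree at most two and full GMRES converges in at most two iterations. A numpy sketch, writing the second GMRES iterate as an explicit least-squares problem rather than using a production GMRES:</p>

```python
import numpy as np

N = 6
A = np.eye(N) + np.diag(np.ones(N - 1), -1)
b = np.array([(-1.0) ** i for i in range(N)])
At = A.copy()
At[0, 0] = 2.0          # the perturbed "factorization"

# Left-preconditioned system: B z = c with B = At^{-1} A, c = At^{-1} b.
B = np.linalg.solve(At, A)
c = np.linalg.solve(At, b)

# B = I + w e1^T is a rank-one update of the identity.
assert np.linalg.matrix_rank(B - np.eye(N)) == 1

# The second GMRES iterate (x0 = 0) minimizes ||c - B z|| over span{c, B c}.
K = np.column_stack([c, B @ c])
coef, *_ = np.linalg.lstsq(B @ K, c, rcond=None)
z2 = K @ coef
print(np.linalg.norm(b - A @ z2))   # ~ machine precision: done in 2 steps
```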