Hermitian matrices are stable under small additive perturbations, but this fact fails dramatically to generalize to the non-Hermitian case: there are non-diagonalizable n x n matrices whose spectra move by as much as ε^(1/n) under an ε-perturbation. This is especially concerning for numerical linear algebra, where even routine machine noise can drastically alter the spectrum of a modestly sized matrix, and is mitigated only by the fact that the non-diagonalizable matrices of any fixed dimension have measure zero.
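The instability above can be seen already on the classical example of a Jordan block: placing a single entry of size ε in the corner moves every eigenvalue from 0 out to the circle of radius ε^(1/n). A minimal NumPy sketch (the choices n = 10 and ε = 1e-10 are illustrative, not from the talk):

```python
import numpy as np

n = 10
eps = 1e-10
# n x n Jordan block with eigenvalue 0: ones on the superdiagonal
J = np.diag(np.ones(n - 1), k=1)
# perturb a single entry (the bottom-left corner) by eps
P = J.copy()
P[-1, 0] = eps
# the perturbed eigenvalues are the n-th roots of eps,
# so they lie on a circle of radius eps**(1/n) around 0
radius = np.max(np.abs(np.linalg.eigvals(P)))
print(radius, eps ** (1 / n))  # both are about 0.1
```

A perturbation of size 10^-10 has moved the spectrum by 10^-1, illustrating why worst-case perturbation bounds are useless here.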
In this talk I'll quantify this fact: a small entry-wise Gaussian perturbation εG of any n x n matrix A has a basis of eigenvectors with condition number poly(n), and eigenvalue gaps at least 1/poly(n). The main technique exploits the relationship between pseudospectrum and eigenvector condition number, reducing the problem to proving certain tail bounds on the small singular values of the matrices zI - A - εG for generic complex z. Time permitting, I'll discuss an application to numerical linear algebra, where this random regularization is used as a preconditioning step in an algorithm that rapidly approximates the eigenvectors and eigenvalues of any matrix.
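The regularization phenomenon can be observed numerically: a defective matrix has an (effectively infinite) eigenvector condition number, while a tiny Gaussian perturbation of it has a well-conditioned eigenbasis. A hedged NumPy sketch, where the normalization of G, the seed, and the values n = 10 and ε = 1e-6 are illustrative choices rather than the normalization used in the talk:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10
# Jordan block: non-diagonalizable, so its eigenvector matrix is singular
J = np.diag(np.ones(n - 1), k=1)

def eigvec_cond(A):
    # condition number of the eigenvector matrix V in A = V D V^{-1}
    _, V = np.linalg.eig(A)
    return np.linalg.cond(V)

eps = 1e-6
# complex Gaussian (Ginibre-style) perturbation, illustrative scaling
G = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2 * n)

print(eigvec_cond(J))            # astronomically large: J is defective
print(eigvec_cond(J + eps * G))  # dramatically smaller after regularization
```

The second condition number is polynomially bounded in n and 1/ε, in line with the quantitative statement above.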
This is based on joint work with Jorge Garza Vargas, Archit Kulkarni, Satyaki Mukherjee, and Nikhil Srivastava, and found in these three papers:
- Gaussian regularization of the pseudospectrum and Davies' conjecture
- Overlaps, eigenvalue gaps, and pseudospectrum under real Ginibre and absolutely continuous perturbations
- Pseudospectral shattering, the sign function, and diagonalization in nearly matrix multiplication time