Math 303: Section 19
\[ \def\R{{\mathbb R}} \def\b{{\mathbf{b}}} \def\x{{\mathbf{x}}} \def\v{{\mathbf{v}}} \def\w{{\mathbf{w}}} \DeclareMathOperator{\Null}{Nul} \]
Finding eigenvalues/eigenvectors is important! But the characteristic equation is often unwieldy and/or leads to approximations in its own right.
Let \(A\) be an arbitrary \(2\times 2\) matrix with two linearly independent eigenvectors \(\v_1,\v_2\) corresponding to eigenvalues \(\lambda_1,\lambda_2\); we assume \(|\lambda_1| > |\lambda_2|\), so that \(\lambda_1\) is the dominant eigenvalue of \(A\) and \(\v_1\) is a dominant eigenvector.
Since \(\v_1\) and \(\v_2\) are linearly independent, for any \(\x_0\in \R^2\) there exist \(a_1, a_2\in \R\) for which
\[ \x_0 = a_1 \v_1 + a_2 \v_2; \]
thus, since \(A^k \v_i = \lambda_i^k \v_i\),
\[ \x_k = A^k \x_0 = a_1 \lambda_1^k \v_1 + a_2 \lambda_2^k \v_2. \]
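We can check this closed form numerically. The sketch below uses an illustrative \(2\times 2\) matrix chosen here for concreteness (eigenvalues \(3\) and \(1\), eigenvectors \((1,1)\) and \((1,-1)\)); the matrix and starting vector are not from the text.

```python
import numpy as np

# Illustrative example: A has eigenvalues lam1 = 3 > lam2 = 1,
# with eigenvectors v1 = (1, 1) and v2 = (1, -1).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam1, lam2 = 3.0, 1.0
v1 = np.array([1.0, 1.0])
v2 = np.array([1.0, -1.0])

# Pick an arbitrary starting vector and find its coordinates a1, a2
# in the eigenvector basis: x0 = a1*v1 + a2*v2.
x0 = np.array([2.0, 0.5])
a1, a2 = np.linalg.solve(np.column_stack([v1, v2]), x0)

# Compare x_k = A^k x0 with the closed form a1*lam1^k*v1 + a2*lam2^k*v2.
k = 10
xk = np.linalg.matrix_power(A, k) @ x0
closed_form = a1 * lam1**k * v1 + a2 * lam2**k * v2
```

The two computations agree to machine precision, as the derivation predicts.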
Divide both sides of \(\x_k = a_1 \lambda_1^k \v_1 + a_2 \lambda_2^k \v_2\) by \(\lambda_1^k\); what happens as \(k\to \infty\)?
Assuming \(a_1\ne 0\), why do the vectors \(\x_k\) approach a vector in the direction of \(\v_1\) or \(-\v_1\)?
What does this tell us about the sequence \(\{\x_k\}\) as \(k\to\infty\)?
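To see this alignment in practice, here is a minimal power-iteration sketch (with an illustrative matrix chosen here, not one from the text). Each iterate is normalized so the entries stay bounded; only the direction matters.

```python
import numpy as np

# Illustrative matrix: eigenvalues 3 and 1, dominant eigenvector along (1, 1).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
x = np.array([2.0, 0.5])  # arbitrary start with a nonzero v1-component

for _ in range(25):
    x = A @ x
    x = x / np.linalg.norm(x)  # normalize so the iterates don't blow up

# After many iterations, x points (up to sign) along the unit vector in
# the direction of v1 = (1, 1).
v1_unit = np.array([1.0, 1.0]) / np.sqrt(2)
```

Since \((\lambda_2/\lambda_1)^k = (1/3)^{25}\) is negligible here, the final `x` is essentially \(\pm\v_1/\|\v_1\|\).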
Let \(A\) be \(n\times n\), \(\lambda\) an eigenvalue of \(A\), and \(\v\) a corresponding (necessarily nonzero) eigenvector.
Explain why \(\lambda = \frac{\lambda (\v\cdot \v)}{\v\cdot \v}\).
Use the previous result to explain why \(\lambda = \frac{(A\v)\cdot \v}{\v\cdot\v}\).
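A quick numeric sanity check of this identity, using an eigenpair chosen here for illustration:

```python
import numpy as np

# A = [[2, 1], [1, 2]] has eigenvalue 3 with eigenvector v = (1, 1).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
v = np.array([1.0, 1.0])

# The Rayleigh quotient (Av . v)/(v . v) should recover the eigenvalue 3.
rayleigh = (A @ v) @ v / (v @ v)
```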
These quotients are called Rayleigh quotients.
If \(\x_k\) converges to a dominant eigenvector of \(A\), then the Rayleigh quotients \(r_k = \frac{(A\x_k)\cdot \x_k}{\x_k\cdot \x_k}\) converge to the dominant eigenvalue of \(A\).
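Putting the pieces together, here is a minimal sketch of the power method with a Rayleigh-quotient eigenvalue estimate, assuming \(r_k\) denotes \(\frac{(A\x_k)\cdot\x_k}{\x_k\cdot\x_k}\) as above. The function name and test matrix are illustrative choices, not from the text.

```python
import numpy as np

def power_method(A, x0, iters=50):
    """Power iteration: returns a Rayleigh-quotient eigenvalue estimate
    and an approximate (unit) dominant eigenvector."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        x = A @ x
        x = x / np.linalg.norm(x)  # keep iterates at unit length
    r = (A @ x) @ x / (x @ x)      # Rayleigh quotient estimate of lambda_1
    return r, x

# Example: dominant eigenvalue of this matrix is 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
r, x = power_method(A, np.array([2.0, 0.5]))
```

This avoids the characteristic equation entirely: only matrix-vector products are needed, which is why the method scales to large matrices.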