Math 303: Section 28

Dr. Janssen

\[ \def\R{{\mathbb R}} \def\P{{\mathbb P}} \def\B{{\mathcal B}} \def\C{{\mathcal C}} \def\S{{\mathcal S}} \def\b{{\mathbf{b}}} \def\c{{\mathbf{c}}} \def\x{{\mathbf{x}}} \def\y{{\mathbf{y}}} \def\u{{\mathbf{u}}} \def\v{{\mathbf{v}}} \def\w{{\mathbf{w}}} \def\z{{\mathbf{z}}} \def\e{{\mathbf{e}}} \def\r{{\mathbf{r}}} \def\M{{\mathcal{M}}} \DeclareMathOperator{\null}{Nul} \DeclareMathOperator{\span}{Span} \DeclareMathOperator{\dim}{dim} \DeclareMathOperator{\proj}{proj} \DeclareMathOperator{\row}{Row} \DeclareMathOperator{\col}{Col} \newcommand{\set}[1]{\left\{ {#1} \right\}} \newcommand{\setof}[2]{{\left\{#1\,\colon\,#2\right\}}} \newcommand{\norm}[1]{{\left|\! \left| #1 \right| \! \right|}} \]

A nonempty subset \(S\) of \(\R^n\) is **orthogonal** if \(\u\cdot\v = 0\) for every pair of distinct vectors \(\u,\v\in S\).

Orthogonal bases \(\B = \set{\v_1, \v_2, \v_3}\) are especially convenient to work with: coordinates with respect to \(\B\) can be found with dot products, rather than by solving a linear system.

If \(\x = x_1 \v_1 + x_2 \v_2 + x_3 \v_3\), then \(x_i = \frac{\x\cdot \v_i}{\v_i\cdot\v_i}\) for each \(i\): dotting both sides of the equation with \(\v_i\) makes every term vanish except \(x_i (\v_i\cdot\v_i)\), since the basis vectors are orthogonal.

Let \(\w_1 = \left[\begin{matrix} -2 \\ 1 \\ -1 \end{matrix}\right], \w_2 = \left[\begin{matrix} 0 \\ 1 \\ 1 \end{matrix}\right]\), and \(\w_3 = \left[\begin{matrix} 1 \\ 1 \\ -1 \end{matrix}\right]\). We can show that \(S_1 = \set{\w_1, \w_2, \w_3}\) is an orthogonal subset of \(\R^3\) (but you don’t need to).

- Is the set \(S_2 = \set{\w_1, \w_2, \w_3, \left[\begin{matrix} 1 \\ 2 \\ 0 \end{matrix}\right]}\) an orthogonal subset of \(\R^3\)?
- Suppose \(\v\) is a vector such that \(S_1 \cup \set{\v}\) is an orthogonal subset of \(\R^3\). Then \(\w_i\cdot \v = 0\) for each \(i\). Explain why this means that \(\v\in \null A\), where \(A = \left[\begin{matrix} -2 & 1 & -1 \\ 0 & 1 & 1 \\ 1 & 1 & -1 \end{matrix}\right]\).
- Assuming that the reduced row echelon form of the matrix \(A\) is \(I_3\), explain why it is not possible to find a nonzero vector \(\v\) so that \(S_1\cup \set{\v}\) is an orthogonal subset of \(\R^3\).
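A quick numerical check of the last point (using NumPy; not part of the activity itself): the matrix \(A\) has nonzero determinant, so \(A\x = \mathbf{0}\) has only the trivial solution and \(\null A = \set{\mathbf{0}}\).

```python
import numpy as np

# Rows of A are w1, w2, w3 from above.
A = np.array([[-2, 1, -1],
              [0, 1, 1],
              [1, 1, -1]], dtype=float)

# A nonzero determinant means A is invertible, so Nul A = {0}:
# no nonzero v is orthogonal to all three w_i.
det_A = np.linalg.det(A)
print(det_A)
```

This is consistent with the reduced row echelon form of \(A\) being \(I_3\).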

Let \(\set{\v_1, \v_2, \ldots, \v_m}\) be a set of nonzero orthogonal vectors in \(\R^n\). Then the vectors \(\v_1, \v_2, \ldots, \v_m\) are linearly independent.
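As a sanity check of this fact on the set \(S_1\) above (a NumPy sketch, not a proof): stacking the vectors as columns and computing the rank shows the columns are linearly independent.

```python
import numpy as np

# Columns are w1, w2, w3 from the earlier example.
V = np.array([[-2, 0, 1],
              [1, 1, 1],
              [-1, 1, -1]], dtype=float)

# Rank equal to the number of columns means the columns
# are linearly independent.
rank_V = np.linalg.matrix_rank(V)
print(rank_V)
```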

**Theorem 28.4.** Let \(\B = \set{\v_1, \v_2, \ldots, \v_m}\) be an orthogonal basis for a subspace \(W\) of \(\R^n\). Let \(\x\) be a vector in \(W\). Then

\[ \x = \frac{\x\cdot\v_1}{\v_1\cdot\v_1} \v_1 + \frac{\x\cdot\v_2}{\v_2\cdot\v_2} \v_2 + \cdots + \frac{\x\cdot\v_m}{\v_m\cdot\v_m} \v_m. \]
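We can illustrate this formula numerically with the orthogonal basis \(S_1\) from above and an arbitrary test vector (the vector \(\x\) below is our own choice, used only for the demonstration):

```python
import numpy as np

w1 = np.array([-2.0, 1.0, -1.0])
w2 = np.array([0.0, 1.0, 1.0])
w3 = np.array([1.0, 1.0, -1.0])

x = np.array([3.0, -1.0, 2.0])  # arbitrary test vector (our choice)

# Coordinates via the dot-product formula -- no linear system needed.
recon = sum((x @ w) / (w @ w) * w for w in (w1, w2, w3))
print(recon)
```

Since \(S_1\) is an orthogonal basis for \(\R^3\), the sum reproduces \(\x\) exactly.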

Let \(\v_1 = \left[\begin{matrix} 1 \\ 0 \\ 1 \end{matrix}\right]\), \(\v_2 = \left[\begin{matrix} 0 \\ 1 \\ 0 \end{matrix}\right]\), and \(\v_3 = \left[\begin{matrix} 0 \\ 0 \\ 1 \end{matrix}\right]\). The set \(\B = \set{\v_1, \v_2, \v_3}\) is a basis for \(\R^3\). Let \(\x = \left[\begin{matrix} 1 \\ 0 \\ 0 \end{matrix}\right]\). Calculate

\[ \frac{\x\cdot\v_1}{\v_1\cdot\v_1} \v_1 + \frac{\x\cdot\v_2}{\v_2\cdot\v_2} \v_2 + \frac{\x\cdot\v_3}{\v_3\cdot\v_3} \v_3. \]

Compare to \(\x\). Does this violate Theorem 28.4? Explain.
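After working the computation by hand, a short NumPy snippet can confirm your arithmetic (note that running it reveals the answer, so try the exercise first):

```python
import numpy as np

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = np.array([0.0, 0.0, 1.0])
x = np.array([1.0, 0.0, 0.0])

# The dot-product coordinate formula applied to this basis.
s = sum((x @ v) / (v @ v) * v for v in (v1, v2, v3))
print(s)
```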

An **orthonormal basis** \(\B = \set{\u_1, \u_2, \ldots,\u_m}\) for a subspace \(W\) of \(\R^n\) is an orthogonal basis such that \(\norm{\u_k} = 1\) for \(1\le k\le m\).

- Let \(\v_1, \v_2\) be orthogonal vectors. Explain how we can obtain unit vectors \(\u_1,\u_2\) in the direction of \(\v_1,\v_2\), respectively.
- Show that \(\u_1, \u_2\) from the previous part are orthogonal.
- Use these ideas to construct an orthonormal basis for \(\R^3\) from the orthogonal basis

\[ S = \set{\left[\begin{matrix} -2 \\ 1 \\ -1 \end{matrix}\right], \left[\begin{matrix} 0 \\ 1 \\ 1 \end{matrix}\right], \left[\begin{matrix} 1 \\ 1 \\ -1 \end{matrix}\right]}. \]
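The normalization step can be checked numerically (a NumPy sketch): divide each vector by its length, then verify that the pairwise dot products form the identity matrix.

```python
import numpy as np

S = [np.array([-2.0, 1.0, -1.0]),
     np.array([0.0, 1.0, 1.0]),
     np.array([1.0, 1.0, -1.0])]

# Scale each vector to unit length; direction is unchanged.
U = [v / np.linalg.norm(v) for v in S]

# Matrix of pairwise dot products: 1s on the diagonal (unit length),
# 0s off the diagonal (orthogonality is preserved by scaling).
G = np.array([[u @ w for w in U] for u in U])
print(G)
```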

Let \(\u_1 = \frac{1}{3} [2 \ 1 \ 2]^\textsf{T}, \u_2 = \frac{1}{3} [-2 \ 2 \ 1]^\textsf{T}, \u_3 = \frac{1}{3} [1 \ 2\ -2]^\textsf{T}\). It is not difficult to see that the set \(\set{\u_1, \u_2, \u_3}\) is an orthonormal basis for \(\R^3\). Let

\[ A = [\u_1 \ \u_2 \ \u_3] = \frac{1}{3} \left[\begin{matrix} 2 & -2 & 1\\ 1 & 2 & 2 \\ 2 & 1 & -2 \end{matrix}\right]. \]

- Use the definition of matrix multiplication to find the entries of the second row of \(A^\textsf{T} A\). Why should you have expected the result?
- With the result of the previous part in mind, what is the matrix product \(A^\textsf{T} A\)? What does this tell us about the relationship between \(A^\textsf{T}\) and \(A^{-1}\)? Use technology to calculate \(A^{-1}\) and confirm your answer.
- Suppose \(P\) is an \(n\times n\) matrix whose columns form an orthonormal basis for \(\R^n\). Explain why \(P^\textsf{T} P = I_n\). Such matrices are called **orthogonal**.
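A numerical check of the matrix \(A\) above (using NumPy; not a substitute for the explanation asked for): the \((i,j)\) entry of \(A^\textsf{T} A\) is \(\u_i \cdot \u_j\), which is \(1\) when \(i = j\) and \(0\) otherwise, so \(A^\textsf{T} A = I_3\) and \(A^{-1} = A^\textsf{T}\).

```python
import numpy as np

A = (1.0 / 3.0) * np.array([[2, -2, 1],
                            [1, 2, 2],
                            [2, 1, -2]], dtype=float)

# Each entry of A^T A is a dot product of two columns of A.
ATA = A.T @ A
print(ATA)

# For an orthogonal matrix, the transpose is the inverse.
A_inv = np.linalg.inv(A)
print(np.allclose(A_inv, A.T))
```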