Proof that if a non-zero vector is orthogonal to a subspace then it is not in the subspace

We consider a non-zero vector v and a matrix U whose column vectors form a basis of a subspace C(U) (the column space of U), with v orthogonal to C(U). We now prove that the column vectors of A = \begin{bmatrix} U & v \end{bmatrix} are linearly independent. Since N(A^{T}A) = N(A), it suffices to show that N(A^{T}A) = \{0\}. We have: A^{T}A = \begin{bmatrix} U^{T}U & U^{T}v\\ v^{T}U & v^{T}v \end{bmatrix} = \begin{bmatrix} U^{T}U & 0\\ 0 & v^{T}v \end{bmatrix}, because v is orthogonal to every column of U. If \begin{bmatrix} x\\ c \end{bmatrix} \in N(A^{T}A), then U^{T}Ux = 0 and (v^{T}v)c = 0; U^{T}U is invertible because U has linearly independent column vectors, so x = 0, and v^{T}v > 0 because v \neq 0, so c = 0. Hence N(A^{T}A) = \{0\}, the column vectors of A are linearly independent, and therefore v is not a linear combination of the columns of U, i.e. v \notin C(U). So the statement follows.
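
As a quick sanity check (not part of the proof above), here is a small numerical sketch, assuming numpy: it builds a random U with independent columns, obtains v by subtracting from a random vector its projection onto C(U), and verifies that A^{T}A is block diagonal and that A = \begin{bmatrix} U & v \end{bmatrix} has full column rank.

import numpy as np

rng = np.random.default_rng(0)
n, k = 5, 3
U = rng.standard_normal((n, k))                 # columns form a basis of C(U) with probability 1
w = rng.standard_normal(n)
v = w - U @ np.linalg.solve(U.T @ U, U.T @ w)   # v = w minus its projection onto C(U)

A = np.column_stack([U, v])
AtA = A.T @ A
print(np.allclose(AtA[:k, k], 0), np.allclose(AtA[k, :k], 0))   # off-diagonal blocks vanish
print(np.linalg.matrix_rank(A) == k + 1)                        # columns of [U v] are independent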


This leads to the fact that if two subspaces U \subseteq \mathbb{R}^{n} and V \subseteq \mathbb{R}^{n} are orthogonal then dim U + dim V \leq n: it is easy to prove that the union of a basis of U and a basis of V is a linearly independent set (a linear dependence would give a non-zero vector lying in both U and V, but such a vector would be orthogonal to itself and hence zero), so this union is a basis of a subspace of \mathbb{R}^{n}, and the dimension dim U + dim V of this subspace cannot exceed the dimension of \mathbb{R}^{n}.
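
A minimal numerical sketch of this bound (illustration only, assuming numpy): take disjoint sets of columns of an orthogonal matrix as bases of two orthogonal subspaces and check that putting the two bases side by side gives linearly independent columns, so the dimensions add up to at most n.

import numpy as np

rng = np.random.default_rng(1)
n = 6
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # Q has orthonormal columns
B_U, B_V = Q[:, :2], Q[:, 2:5]                     # dim U = 2, dim V = 3, U orthogonal to V
combined = np.hstack([B_U, B_V])
print(np.linalg.matrix_rank(combined) == 2 + 3)    # union of the two bases is independent
print(2 + 3 <= n)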

If U and V are orthogonal and dim U + dim V = n, then U is the orthogonal complement of V and vice versa, which means U contains all the vectors that are orthogonal to V and vice versa. The idea of the proof: suppose t \in \mathbb{R}^{n} is orthogonal to U but t is not in V. Adjoining t to a basis of V produces a subspace T with dim T = dim V + 1, and T is still orthogonal to U because every spanning vector of T is orthogonal to U. This contradicts the bound above: dim T + dim U = dim U + dim V + 1 > n. The symmetric argument shows that U contains every vector that is orthogonal to V.
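
The following sketch (again just a numerical illustration, assuming numpy) sets up dim U + dim V = n with U orthogonal to V and checks that a vector constructed to be orthogonal to V already lies in U:

import numpy as np

rng = np.random.default_rng(2)
n, k = 6, 2
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
B_U, B_V = Q[:, :k], Q[:, k:]        # dim U + dim V = n and U is orthogonal to V
w = rng.standard_normal(n)
t = w - B_V @ (B_V.T @ w)            # t = component of w orthogonal to V
residual = t - B_U @ (B_U.T @ t)     # projecting t onto U should give back t
print(np.allclose(residual, 0))      # so t is already in U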

Conversely, if U and V are orthogonal and U contains all the vectors that are orthogonal to V, then dim U + dim V = n. Proof by contraposition: assume dim U + dim V < n; we construct a vector that is orthogonal to V but not in U. After a suitable permutation of the coordinates (which changes neither orthogonality nor dimensions), U has a basis matrix \begin{bmatrix} U'\\I \end{bmatrix} (U' is an (n - dim U) by dim U matrix and the identity block sits on the last dim U coordinates) and V has a basis matrix \begin{bmatrix} I\\V' \end{bmatrix} (V' is an (n - dim V) by dim V matrix and the identity block sits on the first dim V coordinates); the two identity blocks can be placed on disjoint sets of coordinates because, as shown above, a basis of U together with a basis of V is a linearly independent set. Now take the vector m \in \mathbb{R}^{n} whose first dim V components are the negations of the components of the first row of V', whose (dim V + 1)th component is 1, and whose remaining n - dim V - 1 components are 0. The dot product of m with the jth basis column \begin{bmatrix} e_{j}\\V'_{:,j} \end{bmatrix} of V (e_{j} being the jth standard basis vector of \mathbb{R}^{dim V}) is -V'_{1,j} + V'_{1,j} = 0, so m is orthogonal to the subspace V.

Because dim U + dim V < n we have dim U \leq n - dim V - 1, so the last dim U components of m are among its last n - dim V - 1 components, which are all 0. If m were in U, say \begin{bmatrix} U'\\I \end{bmatrix}x = m, the bottom identity block would force x to equal the last dim U components of m, hence x = 0 and m = 0, which is impossible because the (dim V + 1)th component of m is 1. So m is orthogonal to V but m \notin U. Therefore, if dim U + dim V < n, then U does not contain all the vectors that are orthogonal to V; equivalently, if U contains all the vectors that are orthogonal to V, then dim U + dim V \geq n. Combined with the inequality dim U + dim V \leq n proved above, this gives dim U + dim V = n.
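
Here is a sketch of the construction of m (illustration only, assuming numpy; the matrices U' and V' are random placeholders): it builds the echelon-form basis matrices, forms m as described above, and checks that m is orthogonal to V but not in U.

import numpy as np

rng = np.random.default_rng(3)
n, dim_U, dim_V = 7, 2, 3                                   # dim_U + dim_V < n
V_prime = rng.standard_normal((n - dim_V, dim_V))
B_V = np.vstack([np.eye(dim_V), V_prime])                   # basis matrix of V: [I; V']
U_prime = rng.standard_normal((n - dim_U, dim_U))
B_U = np.vstack([U_prime, np.eye(dim_U)])                   # basis matrix of U: [U'; I]

m = np.concatenate([-V_prime[0, :], [1.0], np.zeros(n - dim_V - 1)])
print(np.allclose(B_V.T @ m, 0))                            # m is orthogonal to V
# If m were in U, the bottom identity block would force the coefficients to equal
# the last dim_U entries of m, which are all zero, yet m itself is non-zero.
x, *_ = np.linalg.lstsq(B_U, m, rcond=None)
print(not np.allclose(B_U @ x, m))                          # m is not in U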

I guess that the above results can be generalized to any finite-dimensional inner product space.

