# Proof that if a non-zero vector is orthogonal to a subspace then it is not in the subspace

We consider a non-zero vector $v$ and a matrix $A$ whose set of column vectors is a basis of a subspace $S$ of $\mathbb{R}^n$, with $v$ orthogonal to $S$ (so $A^Tv = 0$). We prove that the column vectors of $B = \begin{bmatrix} A & v \end{bmatrix}$ are linearly independent, which implies that $v$ is not in $S$. The column vectors of $B$ are linearly independent if and only if $B^TB$ is invertible (which means $\det(B^TB) \neq 0$). We have:

$$B^TB = \begin{bmatrix} A^TA & A^Tv \\ v^TA & v^Tv \end{bmatrix} = \begin{bmatrix} A^TA & 0 \\ 0 & v^Tv \end{bmatrix}$$

It is easy to see that $\det(B^TB) = \det(A^TA)\,(v^Tv) \neq 0$ ($A^TA$ is invertible because $A$ has linearly independent column vectors, and $v^Tv > 0$ because $v \neq 0$). So the statement follows.
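The block-diagonal factorization above can be sanity-checked numerically. This is my own illustration (the matrices `A` and `v` below are arbitrary examples, not from the note): appending a vector $v$ orthogonal to the columns of $A$ gives a $B^TB$ whose determinant factors as $\det(A^TA)\,(v^Tv)$, hence is non-zero.

```python
# Sanity check: if v != 0 is orthogonal to the column space of A,
# then B = [A | v] has det(B^T B) = det(A^T A) * (v . v) != 0,
# so the columns of B are linearly independent.

def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(X, Y):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)]
            for row in X]

def det(M):
    # Laplace expansion along the first row (fine for tiny matrices)
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] *
               det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

# Columns of A are a basis of a plane in R^3; v is orthogonal to it.
A = [[1, 0],
     [0, 1],
     [0, 0]]
v = [0, 0, 2]

# v is orthogonal to each column of A
assert all(sum(ai * vi for ai, vi in zip(col, v)) == 0
           for col in transpose(A))

B = [row + [x] for row, x in zip(A, v)]    # B = [A | v]
BtB = matmul(transpose(B), B)
AtA = matmul(transpose(A), A)

print(det(BtB))                             # 4
print(det(AtA) * sum(x * x for x in v))     # 4, same factorization
```

The non-zero determinant confirms that $v$ cannot be a combination of the columns of $A$.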

This leads to the fact that if two subspaces $S_1$ and $S_2$ of $\mathbb{R}^n$ are orthogonal then $\dim S_1 + \dim S_2 \leq n$ (it is easy to prove that the vectors in a basis of $S_1$ together with the vectors in a basis of $S_2$ are linearly independent, hence the union of a basis of $S_1$ with a basis of $S_2$ is a basis of a subspace of $\mathbb{R}^n$, and the dimension of that subspace cannot exceed the dimension of $\mathbb{R}^n$).

**If $S_1$ and $S_2$ are orthogonal and $\dim S_1 + \dim S_2 = n$ then $S_2$ is the orthogonal complement of $S_1$ and vice versa, which means $S_2$ contains all the vectors that are orthogonal to $S_1$ and vice versa.** The idea of the proof: if $v$ is orthogonal to $S_1$ but $v$ is not in $S_2$, then the union of a basis of $S_2$ with $\{v\}$ spans a subspace of dimension $\dim S_2 + 1$ that is still orthogonal to $S_1$, which leads to a contradiction: $\dim S_1 + (\dim S_2 + 1) \leq n$ while $\dim S_1 + \dim S_2 = n$.
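The complement property can be illustrated on concrete subspaces (my own example, not from the note): with $S_1 = \operatorname{span}\{(1,1,0)\}$ and $S_2 = \operatorname{span}\{(1,-1,0),(0,0,1)\}$ in $\mathbb{R}^3$, the dimensions add to $3$, so every vector orthogonal to $S_1$ should lie in $S_2$. Since this basis of $S_2$ happens to be orthogonal, membership can be tested by projection.

```python
# S1 = span{(1,1,0)}, S2 = span{(1,-1,0), (0,0,1)}: orthogonal
# subspaces of R^3 with dim S1 + dim S2 = 3, so S2 is the
# orthogonal complement of S1.

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

basis_S2 = [(1, -1, 0), (0, 0, 1)]

def in_S2(w):
    # project w onto each (orthogonal) basis vector of S2 and
    # check that the projections reassemble w exactly
    rebuilt = [0, 0, 0]
    for b in basis_S2:
        c = dot(w, b) / dot(b, b)
        rebuilt = [r + c * x for r, x in zip(rebuilt, b)]
    return all(abs(r - x) < 1e-12 for r, x in zip(rebuilt, w))

w = (2, -2, 5)                 # orthogonal to S1: dot(w, (1,1,0)) == 0
assert dot(w, (1, 1, 0)) == 0
print(in_S2(w))                # True: w lands inside S2

u = (1, 0, 0)                  # not orthogonal to S1
print(in_S2(u))                # False: u is not in S2
```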

**If $S_2$ contains all the vectors that are orthogonal to $S_1$ then $\dim S_1 + \dim S_2 = n$.** Proof: let $\dim S_1 = r$ and $\dim S_2 = s$, and suppose $r + s < n$. $S_1$ always has a basis given by the columns of an $n$ by $r$ matrix $A$, and $S_2$ always has a basis given by the columns of an $n$ by $s$ matrix $B$. Stack the basis vectors as the rows of the $(r+s)$ by $n$ matrix $C = \begin{bmatrix} A^T \\ B^T \end{bmatrix}$ and bring $C$ to reduced row echelon form. Because $C$ has fewer rows than columns ($r + s < n$), there is a free column, so I am always able to find a special solution $x$ of $Cx = 0$: its component in that free column is $1$, its components in the pivot columns are the negations of the corresponding entries of the free column of the reduced row echelon form, and the rest of its components are $0$. This $x$ is non-zero and orthogonal to both $S_1$ and $S_2$; due to the first result above ($x \neq 0$ and $x$ orthogonal to $S_2$), $x$ is not in the subspace $S_2$ as it is supposed to be. So if $r + s < n$, there exists a vector that is orthogonal to $S_1$ but not in $S_2$, i.e. $S_2$ does not contain all the vectors that are orthogonal to $S_1$. So if $S_2$ contains all the vectors that are orthogonal to $S_1$ then $\dim S_1 + \dim S_2 \geq n$; however, we have proved above that $\dim S_1 + \dim S_2 \leq n$, hence: if $S_2$ contains all the vectors that are orthogonal to $S_1$ then $\dim S_1 + \dim S_2 = n$.
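The special-solution construction in this proof can be sketched in code. This is my own minimal implementation under assumed example bases: when $\dim S_1 + \dim S_2 < n$, the stacked homogeneous system has fewer equations than unknowns, and Gauss-Jordan elimination yields a non-trivial $x$ orthogonal to both subspaces.

```python
# Sketch of the counting step: stack the basis vectors of S1 and S2
# as rows; with fewer rows than columns there is a free column, and
# the special solution (1 in the free column, negated RREF entries
# in the pivot columns, 0 elsewhere) is orthogonal to both subspaces.

from fractions import Fraction

def special_solution(rows, n):
    # Gauss-Jordan elimination to reduced row echelon form
    M = [[Fraction(x) for x in row] for row in rows]
    pivots = []
    r = 0
    for c in range(n):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        M[r] = [x / M[r][c] for x in M[r]]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                M[i] = [a - M[i][c] * b for a, b in zip(M[i], M[r])]
        pivots.append(c)
        r += 1
    free = next(c for c in range(n) if c not in pivots)  # exists: r < n
    x = [Fraction(0)] * n
    x[free] = Fraction(1)
    for row, c in zip(M, pivots):
        x[c] = -row[free]
    return x

# dim S1 + dim S2 = 2 < 3, so some x is orthogonal to both
rows = [(1, 1, 0),    # basis vector of S1
        (1, -1, 1)]   # basis vector of S2
x = special_solution(rows, 3)
print(x)
assert all(sum(a * b for a, b in zip(row, x)) == 0 for row in rows)
assert any(x)   # non-trivial, so x is orthogonal to S2 yet not in S2
```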

I **guess** that the above results can be generalized to any finite-dimensional inner product space.