Matrix Algebra Glossary
Linear subspace
If $L$ is a nonempty set of vectors in $\R^n$, then
$L$ is a linear subspace of $\R^n$ if
- for any two vectors $\vu\in L, \vv\in L$ the sum $\vu+\vv$ also
belongs to $L$, and
- for any vector $\vv\in L$ and any number $c$, the vector
$c\vv$ also belongs to $L$.
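Example. The plane
\[
L=\{\vx\in\R^3 : x_1+x_2+x_3=0\}
\]
is a linear subspace of $\R^3$: if $\vu$ and $\vv$ both have coordinate sum
zero, then so do $\vu+\vv$ and $c\vu$ for any number $c$. The plane
$x_1+x_2+x_3=1$ is not a subspace, since the sum of two of its vectors has
coordinate sum $2$.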
To span
Vectors $\vv_1, \dots, \vv_k$ span a
linear subspace $L$ of $\R^n$ if every vector in $L$ is a linear
combination of the vectors $\vv_1, \dots, \vv_k$.
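Example. The vectors $\vv_1=(1,0,0)$ and $\vv_2=(0,1,0)$ span the subspace
of $\R^3$ consisting of all vectors of the form $(x_1,x_2,0)$, because every
such vector equals $x_1\vv_1+x_2\vv_2$. They do not span all of $\R^3$, since
$(0,0,1)$ is not a linear combination of $\vv_1$ and $\vv_2$.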
Linear independence
Vectors $\vv_1, \dots, \vv_k$ are linearly
independent if the only solution to
\[
c_1\vv_1 + \cdots + c_k\vv_k = \vzero
\]
is
\[
c_1=\cdots=c_k=0.
\]
If $\vv_1, \dots, \vv_k$ are linearly independent, then any vector $\vx$ can
be written in at most one way as a linear combination
\[
\vx=x_1 \vv_1+ \cdots + x_k\vv_k
\]
i.e.
\[
x_1\vv_1 + \cdots + x_k\vv_k = \bar x_1\vv_1 + \cdots + \bar x_k\vv_k
\]
implies $x_1=\bar x_1$, …, $x_k=\bar x_k$.
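Example. The vectors $\vv_1=(1,1)$ and $\vv_2=(1,-1)$ are linearly
independent: the equation $c_1\vv_1+c_2\vv_2=\vzero$ amounts to
$c_1+c_2=0$ and $c_1-c_2=0$, which forces $c_1=c_2=0$. On the other hand,
$(1,2)$ and $(2,4)$ are not linearly independent, because
$2\,(1,2)-(2,4)=\vzero$.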
Basis
Vectors $\vv_1$, …, $\vv_k$ form a
basis for a linear subspace $L$ if they are linearly
independent, and if they span $L$.
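Example. The vectors $(1,0,0)$, $(0,1,0)$, $(0,0,1)$ form a basis for
$\R^3$: they are linearly independent, and they span $\R^3$ because every
vector satisfies
\[
(x_1,x_2,x_3)=x_1(1,0,0)+x_2(0,1,0)+x_3(0,0,1).
\]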
Dimension
The dimension of a linear subspace $L$ of $\R^n$ is
the number of vectors in a basis for $L$. (Any two bases for $L$ contain
the same number of vectors, so this number is well defined.)
Note: in mathematical usage this word is always in the singular: you
say “the dimension of $L$ is 3,” but not “$L$ has
three dimensions.”
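Example. A line through the origin in $\R^3$ has dimension $1$ (a basis
consists of one vector), a plane through the origin has dimension $2$, and
$\R^3$ itself has dimension $3$.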
Solution space
The solution space of a system of homogeneous linear equations $A\vx=\vzero$,
where $A$ is an $m\times n$ matrix, is the set of all vectors $\vx\in\R^n$
that satisfy the equation. The solution space is always a linear subspace of
$\R^n$. This notion is only used for homogeneous equations.
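Example. The solution space of the system
\[
x_1+x_2=0, \qquad x_2+x_3=0
\]
consists of all vectors of the form $\vx=c\,(1,-1,1)$, i.e. it is the line
through the origin spanned by $(1,-1,1)$.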
Eigenvalues and vectors
Let $A$ be an $n\times n$ matrix.
Definition. $\vv$ is an eigenvector for $A$ with eigenvalue $\lambda$ if
$\vv\neq \vzero$ and $A\vv=\lambda \vv$.
Theorem. The eigenvalues are the roots of the characteristic polynomial
$\det\bigl(A-\lambda I\bigr)$, i.e. they are the solutions of the equation $\det\bigl(A-\lambda I\bigr)=0$.
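Example. For the matrix
\[
A=\begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}
\]
the characteristic polynomial is
$\det\bigl(A-\lambda I\bigr)=(2-\lambda)^2-1=\lambda^2-4\lambda+3$, so the
eigenvalues are $\lambda_1=1$ and $\lambda_2=3$. Corresponding eigenvectors
are $\vv_1=(1,-1)$ and $\vv_2=(1,1)$.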
Theorem. Let $1\leq k\leq n$.
If $\vv_1$, …, $\vv_k$ are eigenvectors for a matrix $A$ whose corresponding
eigenvalues $\lambda_1$, …, $\lambda_k$ are pairwise distinct, then
$\{\vv_1, \ldots, \vv_k\}$ is linearly independent.
In particular, if the $n\times n$ matrix $A$ has $n$ distinct
eigenvalues, then any set of corresponding eigenvectors $\{\vv_1,
\vv_2, \ldots, \vv_n\}$ is a basis for $\R^n$.
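Example. The matrix $A$ from the example above is $2\times 2$ and has two
distinct eigenvalues, so its eigenvectors $\{(1,-1),(1,1)\}$ form a basis
for $\R^2$.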
Eigenvectors of symmetric matrices
Eigenvectors with different eigenvalues are normally not perpendicular to each
other. However, there is one important special case in which this does happen:
Theorem. If the matrix $A$ is symmetric and if $\vv$ and $\vw$ are
eigenvectors whose corresponding eigenvalues $\lambda$ and $\mu$ are different,
then $\vv\perp \vw$.
Theorem. For every symmetric matrix $A$ one can find a set of
eigenvectors $\{\vv_1, \ldots , \vv_n\}$ that forms an orthonormal
basis of $\R^n$.
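Example. The matrix $A$ from the eigenvalue example above is symmetric, and
its eigenvectors $\vv_1=(1,-1)$ and $\vv_2=(1,1)$ (with eigenvalues
$1\neq 3$) are indeed perpendicular: $\vv_1\cdot\vv_2=1-1=0$. Dividing each
by its length $\sqrt 2$ turns $\{\vv_1,\vv_2\}$ into an orthonormal basis of
$\R^2$.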
Miscellaneous
General Solution (for linear equations)
The general solution of a linear system of equations
$A\vx=\vb$ is a formula containing a number of parameters such that
any choice of the parameters gives you a solution to $A\vx=\vb$, and
such that every solution can be found by choosing appropriate values
of the parameters.
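Example. The general solution of the single equation $x_1+x_2=2$ in $\R^2$
is
\[
x_1=2-c, \qquad x_2=c,
\]
with one parameter $c$: every choice of $c$ gives a solution, and every
solution arises from exactly one value of $c$.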
General Solution (for a differential equation)
The general solution of a linear differential
equation
\[
y^{(n)}(t) + p_1(t)y^{(n-1)}(t) + \cdots + p_{n-1}(t) y'(t) + p_n(t)
y(t)= f(t)
\]
or
\[
\vx'(t) = A\vx(t) + \vf(t)
\]
is a formula containing a number of parameters (usually called $c_1$,
$c_2$, etc.) such that any choice of the parameters gives you a
solution to the equation, and such that every solution can be found by
choosing appropriate values of the parameters.
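Example. The general solution of $y''(t)+y(t)=0$ is
\[
y(t)=c_1\cos t+c_2\sin t,
\]
with two parameters $c_1$ and $c_2$.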
A Formula is not an Equation
Sadly, the words “formula” and “equation” are frequently confused.
Formula
A group of symbols representing a mathematical
object. E.g. “$2+3$,” or “$\sqrt{x^2+1}$,” or
“$y''(t)+\sin(t)y(t)$”.
Equation
An equation equates. An equation is a group of symbols
expressing that two formulas are equal. E.g. “$(x-1)(x+1) =
x^2-1$,” or “$2x+y=17$.”
An equation always contains an equality sign. It is incorrect to call
“$2x+3$” an equation; “$2x+3$” is a formula.