Linear Systems with Complex Eigenvalues
Complex vectors
Definition
When the matrix $A$ of a system of linear differential equations
\begin{equation}
\dot\vx = A\vx
\label{eq:linear-system}
\end{equation}
has complex eigenvalues, the most convenient way to represent the real solutions
is to use complex vectors. A complex vector is a column vector
\[
\vv =
\begin{bmatrix}
v_1\\\vdots\\v_n
\end{bmatrix}
\]
whose entries $v_k$ are complex numbers. Every complex vector can be written as
$\vv = \va+i\vb$ where $\va$ and $\vb$ are real vectors. To do this write each
entry $v_k=a_k+ib_k,$ with $a_k$ and $b_k$ the real and imaginary parts of
$v_k$, and split the vector $\vv$ as follows:
\begin{equation}
\vv
=
\begin{bmatrix}
v_1\\ \vdots\\ v_n
\end{bmatrix}
=
\begin{bmatrix}
a_1+ib_1\\ \vdots\\ a_n+ib_n
\end{bmatrix}
=
\begin{bmatrix}
a_1\\ \vdots\\ a_n
\end{bmatrix}
+ i
\begin{bmatrix}
b_1\\ \vdots\\ b_n
\end{bmatrix}
\end{equation}
The vectors $\va$ and $\vb$ are the real and imaginary parts of the vector
$\vv=\va+i\vb$.
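The split $\vv=\va+i\vb$ is easy to experiment with numerically. A minimal sketch (using NumPy; the vector chosen here is a made-up example, not from the notes):

```python
import numpy as np

# A complex vector v and its real and imaginary parts a and b, so v = a + i*b.
v = np.array([1 + 2j, 3 - 1j, -2 + 0.5j])
a = v.real   # the entries a_k
b = v.imag   # the entries b_k

# Reassembling a + i*b recovers v exactly.
assert np.allclose(a + 1j * b, v)
```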
Complex conjugate
Every complex number $z=a+bi$ has a complex conjugate,
\[
\bar z = a-bi.
\]
Addition and multiplication of complex numbers behave nicely with respect to
complex conjugation:
\begin{equation}
\overline{z+w} = \bar z + \bar w, \qquad
\overline{zw} = \bar z\, \bar w,\qquad
\overline{\left(\frac zw\right)} = \frac{\bar z}{\bar w}.
\end{equation}
The length or absolute value $|z| = \sqrt{a^2+b^2}$ of a complex number $z=a+bi$
satisfies
\begin{equation}
|z|^2 = z\bar z, \qquad \frac{1}{z} = \frac{\bar z}{|z|^2}.
\end{equation}
The following formulas allow us to recover the real and imaginary parts of a
complex number $z=a+bi$ from $z$ and its complex conjugate:
\begin{equation}
z=a+bi \iff a=\frac{z+\bar z}{2} \quad\text{and}\quad b=\frac{z-\bar z}{2i}.
\end{equation}
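All of the identities above can be checked directly with Python's built-in complex type; the particular numbers $z$ and $w$ below are arbitrary:

```python
# Sanity checks for the conjugation identities.
z, w = 3 + 4j, 1 - 2j

# Conjugation respects sums, products, and quotients.
assert abs((z + w).conjugate() - (z.conjugate() + w.conjugate())) < 1e-12
assert abs((z * w).conjugate() - z.conjugate() * w.conjugate()) < 1e-12
assert abs((z / w).conjugate() - z.conjugate() / w.conjugate()) < 1e-12

# |z|^2 = z * conj(z)   and   1/z = conj(z)/|z|^2.
assert abs(z * z.conjugate() - abs(z) ** 2) < 1e-12
assert abs(1 / z - z.conjugate() / abs(z) ** 2) < 1e-12

# Recovering the real and imaginary parts from z and conj(z).
assert abs((z + z.conjugate()) / 2 - z.real) < 1e-12
assert abs((z - z.conjugate()) / 2j - z.imag) < 1e-12
```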
By analogy we define the complex conjugate of a complex vector $\vv=\va+i\vb$ to
be
\[
\bar \vv = \overline{\va+i\vb} = \va - i\vb.
\]
Then we have
\begin{equation}
\overline{\vv+\vw} = \bar\vv + \bar \vw,\qquad
\overline{z\vv} = \bar z \, \bar \vv
\end{equation}
for any complex vectors $\vv$, $\vw$, and any complex number $z$.
Complex conjugates of complex exponentials
By definition the exponential of a complex number $z=a+bi$ is
\[
e^{a+bi} = e^{a} \bigl(\cos b + i \sin b\bigr).
\]
Replacing $b$ by $-b$, and using that $\cos(-b) = \cos b$, $\sin(-b)=-\sin b$,
leads to
\[
e^{a-bi} = e^{a} \bigl(\cos b - i \sin b\bigr).
\]
Thus for any complex number $z=a+bi$ one has
\[
e^{\bar z} = \overline{e^z}.
\]
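The identity $e^{\bar z}=\overline{e^z}$ and the defining formula $e^{a+bi}=e^a(\cos b+i\sin b)$ can both be verified with the standard `cmath` module (the value of $z$ is arbitrary):

```python
import cmath

z = 1.5 + 0.7j
a, b = z.real, z.imag

# exp(conj(z)) equals conj(exp(z)).
assert abs(cmath.exp(z.conjugate()) - cmath.exp(z).conjugate()) < 1e-12

# The definition: e^{a+bi} = e^a (cos b + i sin b).
expected = cmath.exp(a) * (cmath.cos(b) + 1j * cmath.sin(b))
assert abs(cmath.exp(z) - expected) < 1e-12
```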
Real matrices and complex vectors
Multiplying complex matrices
If $A$ is any matrix, possibly with complex entries, and $\vv$ is a complex
vector, then one defines the matrix product $A\vv$ by exactly the same formula
as in the real case, namely,
\[
\begin{bmatrix}
a_{11} & \dots & a_{1n} \\
\vdots & \ddots & \vdots \\
a_{n1} & \dots & a_{nn}
\end{bmatrix}
\begin{bmatrix}
v_1 \\ \vdots \\ v_n
\end{bmatrix}
=
\begin{bmatrix}
a_{11}v_1 + \cdots + a_{1n}v_n \\
\vdots \\
a_{n1}v_1 + \cdots + a_{nn}v_n
\end{bmatrix}.
\]
If $A$ is a real matrix, i.e. a matrix all of whose entries $a_{kl}$ are
real numbers, and if the complex vector is $\vv = \va+i\vb$, then the matrix
product $A\vv$ satisfies
\[
A\vv = A(\va+i\vb) =A\va + i A\vb.
\]
From this it follows that
\[
\overline{A\vv} = A\bar\vv.
\]
Complex eigenvalues and eigenvectors of real matrices
Let $A$ be a real matrix. If $\lambda$ is a complex eigenvalue of $A$, then the
corresponding eigenvectors will also be complex.
Theorem. If $\lambda$ is a complex eigenvalue of the real matrix
$A$, and if $\vv$ is a corresponding complex eigenvector, then $\bar \lambda$
is also an eigenvalue, and
\[
A\bar \vv = \bar \lambda\, \bar\vv,
\]
i.e. $\bar\vv$ is an eigenvector corresponding to the eigenvalue $\bar
\lambda$.
Proof. This follows from
\(
A\bar\vv = \overline{A\vv} = \overline{\lambda\vv} = \bar \lambda \, \bar\vv.
\)
q.e.d.
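The theorem is easy to observe numerically: for a real matrix, `numpy.linalg.eig` returns the complex eigenvalues in conjugate pairs, and conjugating an eigenvector gives an eigenvector for the conjugate eigenvalue. The matrix below is an arbitrary example:

```python
import numpy as np

# A real matrix with eigenvalues 1 +/- i*sqrt(6), a conjugate pair.
A = np.array([[1.0, 2.0],
              [-3.0, 1.0]])
lams, vecs = np.linalg.eig(A)

# The set of eigenvalues is closed under conjugation.
assert np.allclose(np.sort_complex(lams), np.sort_complex(lams.conjugate()))

# If v is an eigenvector for lambda, conj(v) is one for conj(lambda).
lam, v = lams[0], vecs[:, 0]
assert np.allclose(A @ v.conjugate(), lam.conjugate() * v.conjugate())
```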
Exponentials of matrices with complex eigenvalues
The basic example
Consider the matrix
\[
J=
\begin{bmatrix}
0 & 1 \\ -1 & 0
\end{bmatrix}.
\]
The eigenvalues of this matrix are $\lambda_\pm = \pm i$, and the corresponding eigenvectors are
\[
\vv=
\begin{bmatrix}
1 \\ i
\end{bmatrix}
=
\begin{bmatrix}
1 \\ 0
\end{bmatrix}
+ i
\begin{bmatrix}
0 \\ 1
\end{bmatrix}
\text{ and }
\bar\vv =
\begin{bmatrix}
1 \\ - i
\end{bmatrix}
=
\begin{bmatrix}
1 \\ 0
\end{bmatrix}
- i
\begin{bmatrix}
0 \\ 1
\end{bmatrix}.
\]
(Check for yourself that $J\vv = i\vv$ and $J\bar\vv = -i \bar \vv$.)
The matrix $J$ satisfies
\[
J^2 = - I,\quad J^3=-J,\quad J^4=I, \ldots
\]
and thus we have
\begin{align*}
e^{tJ} &= I + tJ +\frac{t^2}{2!}J^2+ \frac{t^3} {3!}J^3+\cdots\\
&= I + tJ -\frac{t^2}{2!}I - \frac{t^3} {3!}J+\cdots\\
&= \Bigl(1-\frac{t^2} {2!} + \frac{t^4} {4!} - \cdots\Bigr)I
+\Bigl(t-\frac{t^3} {3!}+\cdots\Bigr)J \\[1ex]
&= \cos(t)I + \sin(t)J
\end{align*}
so that
\[
e^{tJ} =
\begin{bmatrix}
\cos t & \sin t \\ - \sin t & \cos t
\end{bmatrix}.
\]
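The series computation above can be reproduced numerically: summing the power series for $e^{tJ}$ does give the rotation matrix. The truncated-series helper below is a sketch, adequate for small matrices and moderate $t$:

```python
import numpy as np

def expm_series(M, terms=40):
    """Matrix exponential via the truncated power series I + M + M^2/2! + ..."""
    out, term = np.eye(len(M)), np.eye(len(M))
    for n in range(1, terms):
        term = term @ M / n
        out = out + term
    return out

J = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
t = 0.7
rotation = np.array([[np.cos(t), np.sin(t)],
                     [-np.sin(t), np.cos(t)]])
assert np.allclose(expm_series(t * J), rotation)
```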
Second example
The eigenvalues of the matrix
\[
A =
\begin{bmatrix}
\alpha & \omega \\ - \omega & \alpha
\end{bmatrix}
\]
are $\lambda=\alpha+i\omega$ and $\bar\lambda=\alpha-i \omega$.
The corresponding eigenvectors are again
\[
\vv =
\begin{bmatrix}
1 \\ i
\end{bmatrix}
=
\begin{bmatrix}
1 \\ 0
\end{bmatrix}
+ i
\begin{bmatrix}
0 \\ 1
\end{bmatrix}
\text{ and }
\bar\vv =
\begin{bmatrix}
1 \\ - i
\end{bmatrix}
=
\begin{bmatrix}
1 \\ 0
\end{bmatrix}
- i
\begin{bmatrix}
0 \\ 1
\end{bmatrix}.
\]
For this matrix we can use the previous example to compute $e^{tA}$.
Write
\[
A = \alpha I + \omega J, \qquad
I =
\begin{bmatrix}
1 &0 \\ 0 &1
\end{bmatrix}, \quad
J =
\begin{bmatrix}
0 & 1 \\ -1 & 0
\end{bmatrix}.
\]
The matrices $I$ and $J$ commute, so $e^{tA} = e^{\alpha tI+\omega tJ} =
e^{\alpha tI} e^{\omega tJ} $. We have
\[
e^{\alpha tI} =
\begin{bmatrix}
e^{\alpha t} & 0 \\ 0 & e^{\alpha t}
\end{bmatrix}, \qquad
e^{\omega tJ} =
\begin{bmatrix}
\cos \omega t & \sin \omega t \\
-\sin \omega t & \cos \omega t
\end{bmatrix}
\]
and thus
\[
e^{tA} =
e^{\alpha t}\begin{bmatrix}
\cos \omega t & \sin \omega t \\
-\sin \omega t & \cos \omega t
\end{bmatrix}
=
\begin{bmatrix}
e^{\alpha t}\cos \omega t & e^{\alpha t}\sin \omega t \\
-e^{\alpha t}\sin \omega t & e^{\alpha t}\cos \omega t
\end{bmatrix}.
\]
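The closed form $e^{tA}=e^{\alpha t}(\cos\omega t\,I+\sin\omega t\,J)$ can likewise be compared against a direct series evaluation of the exponential; the values of $\alpha$, $\omega$, $t$ below are arbitrary:

```python
import numpy as np

def expm_series(M, terms=40):
    # Truncated power series for e^M; fine for small matrices and arguments.
    out, term = np.eye(len(M)), np.eye(len(M))
    for n in range(1, terms):
        term = term @ M / n
        out = out + term
    return out

alpha, omega, t = -0.3, 2.0, 0.9
A = np.array([[alpha, omega],
              [-omega, alpha]])
expected = np.exp(alpha * t) * np.array(
    [[np.cos(omega * t), np.sin(omega * t)],
     [-np.sin(omega * t), np.cos(omega * t)]])
assert np.allclose(expm_series(t * A), expected)
```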
Diagonalizing complex matrices
Suppose that $A$ has $\ell$ real eigenvalues $\lambda_1, \ldots,
\lambda_\ell$ and $k$ pairs of complex eigenvalues $ \alpha_1 \pm
i\omega_1$, $\dots$, $\alpha_k\pm i\omega_k$, where we assume that
$\omega_1, \dots, \omega_k > 0$. Let $\vv_1$, \dots, $\vv_\ell$
be real eigenvectors corresponding to the real eigenvalues, and let
$\vw_1$, $\bar\vw_1$, \dots, $\vw_k$, $\bar\vw_k$ be the complex
eigenvectors that go with the complex eigenvalues. We write these
complex eigenvectors as $\vw_j = \va_j+i\vb_j$, $\bar\vw_j = \va_j -
i\vb_j$, and we define the matrix
\[
V =
\begin{bmatrix}
| & | & & | & | & | & & | & |\\
\vv_1 & \vv_2 & \cdots & \vv_\ell & \va_1 & \vb_1 & \cdots& \va_k & \vb_k\\
| & | & & | & | & | & & | & |\\
\end{bmatrix}
\]
Lemma. $AV = VD$ where $D$ is the
not-quite-diagonal matrix
\[
D =
\begin{pmatrix}
\lambda_1 & & & & & & \\
& \ddots\\
& & \lambda_\ell & & & & & \\
& & & \alpha_1 & \omega_1 & & \\
& & & -\omega_1 & \alpha_1 & & \\
& & & & & \ddots & \\
& & & & & & \alpha_k & \omega_k \\
& & & & & & - \omega_k & \alpha_k \\
\end{pmatrix}
\]
Proof
First consider the simplest case $\ell=0$, $k=1$, i.e. the case where $A$ is a real
$2\times 2$ matrix with a complex eigenvalue $\alpha+i\omega$ and eigenvector
$\vw = \va+i\vb$. It then follows from $A\vw=(\alpha+i\omega)\vw$ that
\[
A\va + iA\vb = A\bigl(\va+i\vb\bigr) = (\alpha+i\omega)(\va+i\vb)
=\bigl(\alpha\va-\omega\vb\bigr) + i\bigl(\omega\va+\alpha\vb\bigr).
\]
Comparing real and imaginary parts we conclude
\[
A\va = \alpha\va-\omega\vb, \qquad
A\vb = \omega\va+\alpha\vb.
\]
If we now form the matrix $V=[\va\;\; \vb]$, and compute $AV$, then we get
\[
AV = A[\va\;\; \vb] = [A\va\;\; A\vb]
= [ \alpha\va-\omega\vb \;\; \omega\va+\alpha\vb]
= [\va\;\; \vb]
\begin{bmatrix}
\alpha & \omega \\ -\omega & \alpha
\end{bmatrix}
=VD,
\]
where
\[
D =
\begin{bmatrix}
\alpha & \omega \\ -\omega & \alpha
\end{bmatrix}.
\]
The general case for larger matrices can be handled similarly: if
\[
V=
\begin{bmatrix}
\vv_1 & \cdots & \vv_\ell & \va_1 & \vb_1 & \cdots& \va_k & \vb_k
\end{bmatrix}
\]
then
\begin{align*}
AV &=
\begin{bmatrix}
A\vv_1 & \cdots & A\vv_\ell & A\va_1 & A\vb_1 & \cdots& A\va_k & A\vb_k
\end{bmatrix}\\
&=\begin{bmatrix}
\lambda_1\vv_1 & \cdots & \lambda_\ell\vv_\ell &
\alpha_1\va_1-\omega_1\vb_1 & \omega_1\va_1+\alpha_1\vb_1 & \cdots&
\alpha_k\va_k-\omega_k\vb_k & \omega_k\va_k+\alpha_k\vb_k
\end{bmatrix}\\
&=
\begin{bmatrix}
\vv_1 & \cdots & \vv_\ell & \va_1 & \vb_1 & \cdots& \va_k & \vb_k
\end{bmatrix}
D\\
&=VD
\end{align*}
where $D$ is the matrix in the lemma.
q.e.d.
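The relation $AV=VD$ in the $2\times 2$ case can be confirmed numerically by extracting $\va$ and $\vb$ from a computed eigenvector. The matrix $A$ here is an arbitrary example with eigenvalues $1\pm 2i$:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [-2.0, 1.0]])
lams, vecs = np.linalg.eig(A)

# Pick the eigenvalue with positive imaginary part and split w = a + i*b.
j = np.argmax(lams.imag)
lam, w = lams[j], vecs[:, j]
alpha, omega = lam.real, lam.imag
a, b = w.real, w.imag

# V = [a b] and the 2x2 block D satisfy AV = VD.
V = np.column_stack([a, b])
D = np.array([[alpha, omega],
              [-omega, alpha]])
assert np.allclose(A @ V, V @ D)
```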
Computing $e^{tA}$
It follows from $AV=VD$ that $A = V D V^{-1}$, and therefore the matrix
exponential $e^{tA}$ is given by $e^{tA} = Ve^{tD}V^{-1}$, so we must first find
$e^{tD}$:
\[
e^{tD} =
\begin{pmatrix}
e^{\lambda_1t} & & & & & & \\
& \ddots\\
& & e^{\lambda_\ell t} & & & & & \\
& & & e^{\alpha_1 t}\cos\omega_1t & e^{\alpha_1 t}\sin \omega_1 t & & \\
& & & -e^{\alpha_1 t}\sin\omega_1 t& e^{\alpha_1 t}\cos\omega_1 t & & \\
& & & & & \ddots & \\
& & & & & & e^{\alpha_k t}\cos\omega_kt & e^{\alpha_k t}\sin \omega_k t \\
& & & & & & -e^{\alpha_k t}\sin\omega_k t& e^{\alpha_k t}\cos\omega_k t
\end{pmatrix}
\]
Solving $\dot\vx = A\vx$ without the matrix exponential
Let $A$ be an $n\times n$ matrix with real eigenvalues
\[
\lambda_m,\quad 1\le m\le\ell,
\]
and complex eigenvalues
\[
\mu_m=\alpha_m + i\omega_m,\quad
\bar\mu_m = \alpha_m-i\omega_m,
\qquad 1\le m \le k,
\]
and let the corresponding eigenvectors be
$\vv_m$ ($1\le m\le \ell$) and $\vw_m,\bar\vw_m$ ($1\le m\le k$).
Assume that $n=\ell+2k$ so that these are all the eigenvalues of $A$.
Theorem. The vectors $\{\vv_1, \dots, \vv_\ell,
\vw_1, \bar\vw_1, \dots, \vw_k, \bar\vw_k\}$ are linearly independent. If
$\vx$ is a real vector then it is of the form
\[
\vx = p_1\vv_1 + \cdots + p_\ell\vv_\ell +
\tfrac12q_1\vw_1 + \tfrac12\bar q_1\bar\vw_1 + \cdots + \tfrac12q_k\vw_k +
\tfrac12\bar q_k\bar\vw_k
\]
where $p_1$, \dots, $p_\ell$ are real and $q_1$, \dots, $q_k$ are complex
numbers.
The linear combination above is often written as
\[
\vx = p_1\vv_1 + \cdots + p_\ell\vv_\ell +
\Re \bigl\{q_1\vw_1 + \cdots + q_k\vw_k\bigr\}
\]
Proof
The vectors $\{\vv_1, \dots, \vv_\ell, \vw_1, \bar\vw_1, \dots, \vw_k,
\bar\vw_k\}$ are eigenvectors corresponding to different eigenvalues, and
therefore they are linearly independent. Since we have assumed that we have
exactly $n=\ell+2k$ eigenvalues, the vectors $\{\vv_1, \dots, \vv_\ell, \vw_1,
\bar\vw_1, \dots, \vw_k, \bar\vw_k\}$ also span $\C^n$, and thus they form a
basis. This means that every vector $\vx\in\C^n$ is a linear combination
\[
\vx = p_1\vv_1 + \cdots + p_\ell\vv_\ell +
\tfrac12 q_1\vw_1 + \tfrac12 r_1\bar \vw_1 + \cdots +
\tfrac12 q_k\vw_k + \tfrac12 r_k\bar \vw_k
\]
for certain complex numbers $p_m, q_m, r_m$.
The vector $\vx$ is real if and only if $\vx = \bar \vx$. Since
\[
\vx =
p_1\vv_1 + \cdots + p_\ell\vv_\ell +
\tfrac12 q_1\vw_1 + \tfrac12 r_1\bar \vw_1 + \cdots +
\tfrac12 q_k\vw_k + \tfrac12 r_k\bar \vw_k
\]
we have
\[
\bar\vx =
\bar p_1\vv_1 + \cdots + \bar p_\ell\vv_\ell +
\tfrac12 \bar q_1\bar \vw_1 + \tfrac12 \bar r_1\vw_1 + \cdots +
\tfrac12 \bar q_k\bar \vw_k + \tfrac12 \bar r_k\vw_k
\]
The vectors $\{\vv_1, \dots, \vv_\ell, \vw_1, \bar\vw_1, \dots, \vw_k,
\bar\vw_k\}$ are linearly independent, so $\vx=\bar\vx$ holds if and only if
\[
p_m=\bar p_m \quad(1\le m\le\ell)
\qquad\text{and}\qquad
r_m=\bar q_m \quad(1\le m\le k).
\]
Thus the coefficients of the real vectors $\vv_m$ must be real, while the
coefficients of the complex vectors $\{\vw_m, \bar \vw_m\}$ come in complex
conjugate pairs $q_m, \bar q_m$.
Since $\bar q_j\bar \vw_j$ is the complex conjugate of the vector $q_j\vw_j$,
one has
\[
q_j\vw_j + \bar q_j\bar\vw_j = 2\Re \bigl(q_j\vw_j\bigr)
\]
which implies the second form for $\vx$ in the theorem.
q.e.d.
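For a $2\times 2$ matrix with no real eigenvalues ($\ell=0$, $k=1$) the theorem says a real vector satisfies $\vx = \Re\{q\vw\}$ with $q = 2c_1$, where $c_1$ is the first coefficient of $\vx$ in the basis $\{\vw,\bar\vw\}$. A numerical check, with $A$ and $\vx$ chosen arbitrarily:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [-2.0, 1.0]])      # eigenvalues 1 +/- 2i
lams, vecs = np.linalg.eig(A)
j = np.argmax(lams.imag)
w = vecs[:, j]                    # eigenvector for alpha + i*omega
x = np.array([3.0, -1.0])         # an arbitrary real vector

# Expand x in the basis {w, conj(w)}; the coefficients come out conjugate,
# so x = Re{ q w } with q twice the first coefficient.
B = np.column_stack([w, w.conjugate()])
c = np.linalg.solve(B, x.astype(complex))
assert np.allclose(c[1], c[0].conjugate())

q = 2 * c[0]
assert np.allclose((q * w).real, x)
```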
Theorem. The general real valued solution of $\dot \vx=A\vx$ is
given by
\begin{align*}
\vx(t) &= p_1e^{\lambda_1t}\vv_1 + \cdots + p_\ell e^{\lambda_\ell t}\vv_\ell
+ \Re\Bigl\{ q_1e^{\mu_1 t}\vw_1 + \cdots + q_k e^{\mu_kt}\vw_k\Bigr\} \\
&= p_1e^{\lambda_1t}\vv_1 + \cdots + p_\ell e^{\lambda_\ell t}\vv_\ell
+ e^{\alpha_1 t}\Re\bigl\{ q_1e^{i\omega_1 t}\vw_1\bigr\}
+ \cdots
+ e^{\alpha_k t}\Re\bigl\{ q_ke^{i\omega_k t}\vw_k\bigr\}.
\end{align*}
Here the coefficients $p_j, q_j$ are determined by the initial value of the
solution through
\[
\vx(0) = p_1\vv_1 + \cdots + p_\ell\vv_\ell +
\Re \bigl\{q_1\vw_1 + \cdots + q_k\vw_k\bigr\}
\]
The second form of the solution shows that the terms in the solution
corresponding to the complex eigenvalues grow or decay according to the real
parts $\alpha_j$ of the complex eigenvalues, while they oscillate with
frequency $\omega_j$.
Proof
Each of the terms
\[
e^{\lambda_j t}\vv_j, \quad e^{\mu_jt}\vw_j, \quad e^{\bar \mu_j t}\bar\vw_j
\]
satisfies $\dot\vx = A\vx$, and therefore any linear combination, such as
\begin{align*}
\vx(t) =
p_1e^{\lambda_1t}\vv_1 &+ \cdots + p_\ell e^{\lambda_\ell t}\vv_\ell \\
& + \frac{q_1}{2}e^{\mu_1t}\vw_1
+ \frac{r_1}{2}e^{\bar\mu_1t}\bar\vw_1
+\cdots
+ \frac{q_k}{2}e^{\mu_kt}\vw_k
+ \frac{r_k}{2}e^{\bar\mu_kt}\bar\vw_k
\end{align*}
also is a solution of $\dot\vx=A\vx$.
If we choose $p_j\in\R$ and $q_j\in\C$ so that
\[
\vx(0) = p_1\vv_1 + \cdots + p_\ell\vv_\ell +
\Re \bigl\{q_1\vw_1 + \cdots + q_k\vw_k\bigr\}
\]
holds, and set $r_j=\bar q_j$, then we find that
\begin{align*}
\vx(t)
&=
p_1e^{\lambda_1t}\vv_1 + \cdots + p_\ell e^{\lambda_\ell t}\vv_\ell
+ \frac12 \Bigl\{q_1e^{\mu_1t}\vw_1
+ r_1e^{\bar\mu_1t}\bar\vw_1
+\cdots
+ q_ke^{\mu_kt}\vw_k
+ r_ke^{\bar\mu_kt}\bar\vw_k\Bigr\}\\
&=
p_1e^{\lambda_1t}\vv_1 + \cdots + p_\ell e^{\lambda_\ell t}\vv_\ell
+ \frac12 \Bigl\{q_1e^{\mu_1t}\vw_1
+ \overline{q_1e^{\mu_1t}\vw_1}
+\cdots
+ q_ke^{\mu_kt}\vw_k
+ \overline{q_ke^{\mu_kt}\vw_k}\Bigr\}\\
&=
p_1e^{\lambda_1t}\vv_1 + \cdots + p_\ell e^{\lambda_\ell t}\vv_\ell
+ \Re\Bigl\{ q_1e^{\mu_1 t}\vw_1 + \cdots + q_k e^{\mu_kt}\vw_k\Bigr\}.
\end{align*}
Using $\mu_j = \alpha_j + i\omega_j$ one can rewrite
\[
\Re\bigl\{q_j e^{\mu_j t}\vw_j\bigr\}
=
\Re\bigl\{q_j e^{\alpha_j t}e^{i\omega_j t}\vw_j\bigr\}
=
e^{\alpha_j t}\Re\bigl\{q_j e^{i\omega_j t}\vw_j\bigr\},
\]
because $e^{\alpha_jt}$ is real.
q.e.d.
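Putting the pieces together: in the $2\times 2$ case with $\ell=0$, $k=1$, the solution formula $\vx(t)=\Re\{q e^{\mu t}\vw\}$ should agree with $\vx(t)=e^{tA}\vx(0)$. The following sketch checks this, using a truncated-series matrix exponential and an arbitrary matrix and initial value:

```python
import numpy as np

def expm_series(M, terms=40):
    # Truncated power series for e^M; fine for small matrices and arguments.
    out, term = np.eye(len(M)), np.eye(len(M))
    for n in range(1, terms):
        term = term @ M / n
        out = out + term
    return out

A = np.array([[1.0, 2.0],
              [-2.0, 1.0]])      # eigenvalues mu = 1 +/- 2i
lams, vecs = np.linalg.eig(A)
j = np.argmax(lams.imag)
mu, w = lams[j], vecs[:, j]

x0 = np.array([3.0, -1.0])
# Coefficient q chosen so that x(0) = Re{q w} = x0.
q = 2 * np.linalg.solve(np.column_stack([w, w.conjugate()]),
                        x0.astype(complex))[0]

t = 0.6
x_formula = (q * np.exp(mu * t) * w).real    # Re{ q e^{mu t} w }
x_expm = expm_series(t * A) @ x0             # e^{tA} x(0)
assert np.allclose(x_formula, x_expm)
```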