Notes - Linear Algebra MT23, Adjoints


Flashcards

Given an inner product space $V$ with associated natural isomorphism $\phi : V \to V'$, define the adjoint $T^\ast$ of a linear transformation $T$ in two different ways: first, by drawing a commutative diagram and deducing a formula for $T^\ast$, and second, more concretely in terms of the inner product.


Commutative diagram: two copies of $V$ at the top, two copies of $V'$ at the bottom, $T$ linking the $V$s, $T'$ linking the $V'$s, and finally $T^\ast$ pointing in the opposite direction to $T$.

Then $T^\star = \phi^{-1} \circ T' \circ \phi$. More concretely, $T^\ast$ is the linear map satisfying

\[\langle T^\ast v, w \rangle = \langle v, Tw \rangle\]

$\forall v, w \in V$.

Suppose $T : V \to V$ is a linear transformation on a finite-dimensional inner product space. Quickly prove that $T^\star$ exists and is linear.


The adjoint satisfies the property

\[\forall v, w \in V \quad \langle T^\ast v, w \rangle = \langle v, Tw \rangle\]

Consider $S : V \to V'$ given by $v \mapsto f _ v$ where $f _ v(w) := \langle v, T(w) \rangle$. Then $f _ v$ is a linear functional.

Since $V$ and $V'$ are finite dimensional, they are isomorphic under the map $\phi : V \to V'$ given by

\[\phi(v) = \langle v, \cdot \rangle\]

Hence any linear functional can be put in the same “form”: the inner product with some fixed vector in the first argument. In particular, for $f _ v$ we can find some $u$ such that

\[f_v = \langle u, \cdot \rangle\]

Define $T^\star(v) = u$. Then

\[\langle v, T(w) \rangle = \langle u, w\rangle = \langle T^\star v, w\rangle\]

for all $v, w \in V$. So it satisfies the required properties. To see $T^\star$ is linear, note that

\[\begin{aligned} \langle T^\star (\lambda v_1 + v_2), w \rangle &= \langle \lambda v_1 + v_2, Tw \rangle \\\\ &= \overline \lambda\langle v_1, Tw\rangle + \langle v_2, Tw \rangle \\\\ &= \langle \lambda T^\star v_1, w \rangle + \langle T^\star v_2, w\rangle \\\\ &= \langle \lambda T^\star v_1 + T^\star v_2, w \rangle \end{aligned}\]

Since this holds for all $w \in V$, subtracting the right-hand side from the left-hand side and using non-degeneracy of the inner product yields

\[T^\star(\lambda v_1 + v_2) = \lambda T^\star v_1 + T^\star v_2\]

Suppose $T : V \to V$ is a linear map and $\mathcal B = \{e _ 1, \cdots, e _ n\}$ is an orthonormal basis for $V$. If $T$ is represented by a matrix $A$ with respect to this basis, how can you more concretely describe the adjoint $T^\ast$?


\[A^\ast = \overline A^\intercal\]
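A quick numerical sanity check of the defining property (a sketch, assuming NumPy; `np.vdot` is conjugate-linear in its first argument, matching the convention used in these notes):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
# Random complex matrix A representing T in an orthonormal basis
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A_star = A.conj().T  # conjugate transpose

v = rng.standard_normal(n) + 1j * rng.standard_normal(n)
w = rng.standard_normal(n) + 1j * rng.standard_normal(n)

lhs = np.vdot(A_star @ v, w)   # <T* v, w>
rhs = np.vdot(v, A @ w)        # <v, T w>
assert np.isclose(lhs, rhs)
```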

Suppose $V$ is finite dimensional and $S, T$ are linear maps. What is

\[(S + T)^\ast\]

?


\[S^\ast + T^\ast\]

Suppose $V$ is finite dimensional and $S, T$ are linear maps. What is

\[(ST)^\ast\]

?


\[T^\ast S^\ast\]

Suppose $V$ is finite dimensional and $T$ is a linear map. What is

\[(\lambda T)^\ast\]

?


\[\overline\lambda T^\ast\]

Suppose $V$ is finite dimensional and $T$ is a linear map. What is

\[(T^\ast)^\ast\]

?


\[T\]

Suppose $V$ is finite dimensional and $T$ is a linear map. What is

\[m_{T^\ast}\]

?


\[\overline{m_T}\]

What does it mean for a linear map $T : V \to V$ to be self-adjoint?


\[T = T^\ast\]

Suppose $T : V \to V$ is a self-adjoint linear map. What can you say about the eigenvalues $\lambda$ of $T$?


\[\lambda \in \mathbb R\]

Suppose $T : V \to V$ is a self-adjoint linear map. Quickly prove that if $\lambda$ is an eigenvalue of $T$, then $\lambda$ is real.


Assume $w \ne 0$ and $T(w) = \lambda w$ for some $\lambda \in \mathbb C$. Then

\[\begin{aligned} \lambda \langle w, w \rangle &= \langle w, \lambda w \rangle \\\\ &= \langle w, Tw \rangle \\\\ &= \langle T^\ast(w), w \rangle \\\\ &= \langle T(w), w \rangle \\\\ &= \langle \lambda w, w \rangle \\\\ &= \overline \lambda \langle w, w\rangle \end{aligned}\]

Since $w \ne 0$, $\langle w, w \rangle \ne 0$, so $\lambda = \overline\lambda$ and hence $\lambda \in \mathbb R$.
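A numerical illustration of this fact (a sketch, assuming NumPy; the Hermitian matrix is built from a random one):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
T = B + B.conj().T  # T equals its conjugate transpose, so T is self-adjoint

eigvals = np.linalg.eigvals(T)
# All eigenvalues have (numerically) zero imaginary part
assert np.allclose(eigvals.imag, 0)
```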

Suppose $T : V \to V$ is a self-adjoint linear map. If $U \subseteq V$ is $T$-invariant, what can you say about $U^\perp$?


It is also $T$-invariant.

Suppose $T : V \to V$ is a self-adjoint linear map. Quickly prove that if $U \subseteq V$ is $T$-invariant, then $U^\perp$ is also $T$-invariant.


Let $w \in U^\perp$. Then $\forall u \in U$,

\[\langle u, T(w)\rangle = \langle T^\ast(u), w \rangle = \langle T(u), w\rangle = 0\]

as $T(u) \in U$ and $w \in U^\perp$. Hence $T(w) \in U^\perp$.

Suppose $T : V \to V$ is a self-adjoint linear map and $V$ is finite-dimensional. Can you state a “spectral theorem”?


There exists an orthonormal basis of $V$ consisting of eigenvectors for $T$.
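The statement can be checked numerically: `np.linalg.eigh` returns exactly such a basis for a Hermitian matrix (a sketch, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
T = B + B.conj().T  # self-adjoint

# eigh returns real eigenvalues and a unitary matrix of eigenvectors
lams, Q = np.linalg.eigh(T)
assert np.allclose(Q.conj().T @ Q, np.eye(n))  # columns are orthonormal
assert np.allclose(T @ Q, Q * lams)            # column j is an eigenvector for lams[j]
```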

Quickly prove that

\[\langle T^\ast v, w \rangle = \langle v, Tw \rangle \quad \forall v, w \in V\]

specifies $T^\ast$ uniquely.


Suppose $B$, $C$ are two choices for $T^\ast$. Then

\[\begin{aligned} \langle Bv - Cv, w \rangle &= \langle Bv, w\rangle - \langle Cv, w\rangle \\\\ &= \langle v, Tw \rangle - \langle v, Tw \rangle \\\\ &= 0 \end{aligned}\]

Since this holds for all $v, w \in V$, non-degeneracy gives $B = C$.

Quickly prove that if $T : V \to V$ is a self-adjoint linear map, then there exists an orthonormal basis of $V$ consisting of eigenvectors of $T$.


Let $v$ be an eigenvector of $T$. Consider $U = \langle v \rangle$. Then $U$ is $T$-invariant. This implies that $U^\perp$ is also $T$-invariant, so the map

\[T |_{U^\perp} : U^\perp \to U^\perp\]

is well defined. Since $U^\perp$ has dimension $n - 1$, we can assume by induction that it has an orthonormal basis of eigenvectors $\{e _ 2, \ldots, e _ n\}$. These, along with $\frac{v}{ \vert \vert v \vert \vert }$, give an orthonormal basis of eigenvectors for all of $V$.

Suppose $T : V \to V$ is linear. Given a basis $\mathcal B$, what is a sufficient condition to ensure that

\[{}_{\mathcal B}[T^\star]_{\mathcal B} = \overline{{}_{\mathcal B}[T]_{\mathcal B}}^{\,\top}\]

?


$\mathcal B$ is orthonormal.

Quickly prove that if:

  • $T : V \to V$ linear
  • $\mathcal B = [e _ 1, \cdots, e _ n]$ orthonormal basis for $V$

Then

\[{}_{\mathcal B}[T^\star]_{\mathcal B} = \overline{{}_{\mathcal B}[T]_{\mathcal B}}^{\,\top}\]

Let $A = {} _ {\mathcal B}[T] _ {\mathcal B}$ and $B = {} _ {\mathcal B} [T^\star] _ {\mathcal B}$. We aim to show $b _ {ij} = \overline{ a _ {ji} \,}$. Note that

\[\begin{aligned} \langle e_i, Te_j \rangle &= \langle e_i, \sum^n_{k = 1} a_{kj} e_k \rangle \\\\ &= a_{ij} \end{aligned}\]

So this expression lets us “extract coordinates” from the corresponding matrix. Applying this to either side:

\[\begin{aligned} b_{ij} &= \langle e_i, T^\star(e_j) \rangle \\\\ &= \langle T(e_i), e_j\rangle \\\\ &= \overline{\langle e_j, T(e_i)\rangle} \\\\ &= \overline{a_{ji}\\,} \end{aligned}\]

Quickly justify that if

  • $T : V \to V$ is linear
  • $\mathcal B = [e _ 1, \cdots, e _ n]$ is an orthonormal basis for $V$
  • $A = {} _ {\mathcal B} [T] _ {\mathcal B}$

Then

\[a_{ij} = \langle e_i, T(e_j) \rangle\]

and state the theorem that this is a useful result for.


\[\langle e_i, T(e_j) \rangle = \left\langle e_i, \sum^n_{k=1}a_{kj} e_k \right\rangle = a_{ij}\langle e_i, e_i\rangle = a_{ij}\]

This is useful for showing, under the above assumptions,

\[{}_{\mathcal B}[T^\star]_{\mathcal B} = \overline{{}_{\mathcal B}[T]_{\mathcal B}}^{\,\top}\]
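Both facts can be verified numerically with the standard basis, which is orthonormal for the usual complex inner product (a sketch, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
E = np.eye(n)  # standard basis, orthonormal under np.vdot

# a_ij = <e_i, T e_j>: the inner product "extracts coordinates"
for i in range(n):
    for j in range(n):
        assert np.isclose(np.vdot(E[:, i], A @ E[:, j]), A[i, j])

# and the matrix of T* is the conjugate transpose: b_ij = conj(a_ji)
A_star = A.conj().T
for i in range(n):
    for j in range(n):
        assert np.isclose(A_star[i, j], np.conj(A[j, i]))
```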

State what it means for a matrix $T$ to be normal, and quickly prove that this implies $\ker (T - \lambda I)^\perp$ is $T$-invariant, along with the mini lemma that makes this work (commuting maps preserve each other's eigenspaces).


$T$ being normal means

\[T^\star T = T T^\star\]

Suppose $v \in \ker(T - \lambda I)^\perp$. Then $\forall u \in \ker(T - \lambda I)$

\[\langle u, v\rangle = 0\]

Now consider whether $Tv$ has this property. Fix $u \in \ker(T - \lambda I)$, then

\[\langle u, Tv \rangle = \langle T^\star u, v \rangle = \langle u', v \rangle = 0\]

where $u'$ is in $\ker(T - \lambda I)$. This is because commuting matrices preserve each other's eigenspaces; in this specific case, if $w \in \ker(T - \lambda I)$, then

\[T(T^\star w) = T^\star (Tw) = T^\star \lambda w = \lambda (T^\star w)\]

so $T^\star w \in \ker(T - \lambda I)$.
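The lemma can be checked numerically on a normal matrix with a repeated eigenvalue (a sketch, assuming NumPy; the unitary conjugation of a diagonal matrix is an illustrative construction, not part of the notes):

```python
import numpy as np

rng = np.random.default_rng(4)
# Build a normal matrix U D U* with repeated eigenvalue 2
U, _ = np.linalg.qr(rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3)))
D = np.diag([2.0, 2.0, 5.0 + 1j])
T = U @ D @ U.conj().T
T_star = T.conj().T
assert np.allclose(T @ T_star, T_star @ T)  # T is normal

w = U[:, 0]                                 # w lies in ker(T - 2I)
assert np.allclose(T @ w, 2 * w)
# T* preserves the eigenspace: T*(w) is still in ker(T - 2I)
assert np.allclose((T - 2 * np.eye(3)) @ (T_star @ w), 0)
```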

Suppose:

  • $V$ is a finite dimensional inner product space over $\mathbb C$
  • $T : V \to V$ is a linear map
  • $T$ is positive, i.e. $\forall v \ne 0$, $\langle v, Tv \rangle \in \mathbb R _ {> 0}$.

Quickly prove that $T$ is self-adjoint.


Let $S = T - T^\star$. Then $S^\star = -S$, so defining $D = iS$ gives a self-adjoint linear map since $D^\star = -iS^\star = iS$.

Let $\lambda$ be an eigenvalue of $D$ with corresponding eigenvector $v$. Then since $D$ is self-adjoint, $\lambda$ must be real. Hence

\[\begin{aligned} \lambda |v|^2 &= \langle v, Dv\rangle \\\\ &= i\left(\langle v, Tv\rangle - \langle Tv, v \rangle\right) \end{aligned}\]

but this must imply $\lambda = 0$: positivity makes $\langle v, Tv\rangle$ real, and $\langle Tv, v\rangle$ is its conjugate, hence also real, so the right-hand side is $i$ times a real number while the left-hand side is real. Hence all eigenvalues of $D$ are $0$ and, because it is self-adjoint, there is a basis in which it is $\text{diag}(0, \cdots, 0)$, so in fact $D = 0$. Then $T = T^\star$.
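The first step of the proof, that $D = i(T - T^\star)$ is always self-adjoint, is easy to check numerically (a sketch, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 4
T = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
S = T - T.conj().T   # S* = -S: skew-adjoint
D = 1j * S           # multiplying by i flips the sign back: D* = D

assert np.allclose(S.conj().T, -S)
assert np.allclose(D.conj().T, D)
```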

Suppose we have deduced that a particular matrix $A$ satisfies

\[\langle v, Av \rangle = 0 \quad \forall v \in V\]

It doesn’t quite follow that $A = 0$ by just non-degeneracy, since for that we would require that

\[\langle u, Av\rangle = 0 \quad \forall u, v \in V\]

How can you deduce that in this case we still actually have $A = 0$?


Consider applying the original condition to the vector $u + w$ for some $u, w$, i.e.

\[\langle u + w, A(u + w) \rangle = 0\]

Then, expanding

\[\langle u, Au\rangle + \langle u, Aw \rangle + \langle w, Au \rangle + \langle w, Aw\rangle = 0\]

Using the original condition again, we can deduce that

\[\langle u, Aw \rangle + \langle w, Au\rangle = 0\]

Applying the same process to $u + iw$, we get

\[i \langle u, Aw\rangle - i\langle w, Au\rangle = 0\]

Solving this system simultaneously, we get that

\[\langle u, Aw \rangle = 0\]

which implies $A = 0$ by non-degeneracy.
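The two expansions can be solved for $\langle u, Aw \rangle$ explicitly; here is a numerical sketch of that polarisation trick (assuming NumPy; `off_diagonal_inner` is a hypothetical helper name):

```python
import numpy as np

def off_diagonal_inner(A, u, w):
    """Recover <u, A w> using only 'diagonal' values of the form <x, A x>."""
    q = lambda x: np.vdot(x, A @ x)
    # From <u+w, A(u+w)>:   <u,Aw> + <w,Au>   = q(u+w)    - q(u) - q(w)
    s1 = q(u + w) - q(u) - q(w)
    # From <u+iw, A(u+iw)>: i<u,Aw> - i<w,Au> = q(u+1j*w) - q(u) - q(w)
    s2 = q(u + 1j * w) - q(u) - q(w)
    # Solving the system simultaneously gives 2<u, Aw> = s1 - i*s2
    return (s1 - 1j * s2) / 2

rng = np.random.default_rng(6)
n = 3
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
u = rng.standard_normal(n) + 1j * rng.standard_normal(n)
w = rng.standard_normal(n) + 1j * rng.standard_normal(n)
assert np.isclose(off_diagonal_inner(A, u, w), np.vdot(u, A @ w))
```

Note that this argument needs the complex scalar $i$; over $\mathbb R$ the condition $\langle v, Av\rangle = 0$ for all $v$ does not force $A = 0$ (e.g. a rotation by $90^\circ$).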

Suppose $T$ is a normal matrix, i.e. $T^\star T = T T^\star$. What relationship is there between $\ker T$, $\ker T^\star$, $\text{Im }T$, $\text{Im } T^\star$?


  • $\ker T = \ker T^\star$
  • $\text{Im } T = \text{Im } T^\star$

Suppose $T$ is a normal matrix, i.e. $T^\star T = T T^\star$. Quickly prove the following relationship between $\ker T$, $\ker T^\star$, $\text{Im }T$, $\text{Im } T^\star$:

  • $\ker T = \ker T^\star$
  • $\text{Im } T = \text{Im } T^\star$

$\ker T = \ker T^\star$:

\[\begin{aligned} v \in \ker T &\iff Tv = 0 \\\\ &\iff ||Tv||= 0 \\\\ &\iff ||T^\star v|| = 0 \\\\ &\iff T^\star v = 0 \\\\ &\iff v \in \ker T^\star \end{aligned}\]

where in the step $ \vert \vert Tv \vert \vert = 0 \iff \vert \vert T^\star v \vert \vert = 0$ we use the assumption that $T$ is normal.

$\text{Im } T = \text{Im }T^\star$: A bit trickier, consider that

\[V = (\ker T) \oplus (\ker T)^\perp\]

and by standard results about adjoints, both $\ker T$ and $(\ker T)^\perp$ are $T$ and $T^\star$ invariant. Furthermore, note that $T$ and $T^\star$ are both bijective when restricted to $(\ker T)^\perp$. Then

\[T(V) = T((\ker T) \oplus (\ker T)^\perp) = T((\ker T)^\perp) = (\ker T)^\perp\]

and likewise

\[T^\star(V) = T^\star((\ker T) \oplus (\ker T)^\perp) = T^\star((\ker T)^\perp) = (\ker T)^\perp\]

Hence

\[\text{Im } T = \text{Im }T^\star = (\ker T)^\perp\]

Can you give a characterisation of when $T$ is normal in terms of norms?


\[T \text{ normal} \iff ||T(v)|| = ||T^\star(v)|| \quad \forall v \in V\]
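A numerical sketch of the forward direction (assuming NumPy; a unitary conjugate of a diagonal matrix serves as the normal example):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 4
# A unitary conjugate of a diagonal matrix is normal
U, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
T = U @ np.diag(rng.standard_normal(n) + 1j * rng.standard_normal(n)) @ U.conj().T
T_star = T.conj().T
assert np.allclose(T @ T_star, T_star @ T)  # normal

# ||T v|| = ||T* v|| for every v
for _ in range(5):
    v = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    assert np.isclose(np.linalg.norm(T @ v), np.linalg.norm(T_star @ v))
```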


