Andrius Kulikauskas

  • m a t h 4 w i s d o m - g m a i l
  • +370 607 27 665
  • My work is in the Public Domain for all to share freely.

See: Classical Lie groups

Study the possible constraints on the matrices representing the elements of a Lie algebra.


Notation

Let {$L$} be a Lie algebra whose elements are matrices for which {$[A,B]=AB-BA$}.

Given a variable with two indices, {$a_{xy}$}, interpret it as a path from {$x$} to {$y$}, and as a matrix whose entries are all zero except for the entry {$a_{xy}$}. Given a product of {$r$} such variables, {$a_{x_{1} y_{1}} a_{x_{2} y_{2}} \cdot \dots \cdot a_{x_{r} y_{r}}$}, if {$y_s = x_{s+1}$} for all {$s$} with {$1 \leq s < r$}, then interpret it as a path from {$x_{1}$} through {$x_{2}, \dots, x_{r}$} all the way to {$y_{r}$}; otherwise identify it with zero. Given a sum of such products, identify it with a set of paths. Interpret the sign of a product as indicating whether it is read from left to right (positive sign) or from right to left (negative sign), which is to say, with the slack on the left (positive sign) or the slack on the right (negative sign).

Given a path {$a_{x_{1} y_{1}} a_{x_{2} y_{2}} \cdot \dots \cdot a_{x_{r} y_{r}}$}, consider it as defining a matrix {$M$} for which the entry {$M_{x_1,y_r} = a_{x_{1} y_{1}} a_{x_{2} y_{2}} \cdot \dots \cdot a_{x_{r} y_{r}}$} and all other entries are {$0$}.
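This correspondence between products of variables and matrix products can be checked directly. A minimal sketch in Python with NumPy, using 0-based indices and a hypothetical matrix size of 4:

```python
import numpy as np

def path(x, y, n=4, value=1.0):
    """Matrix for a_{xy}: all entries zero except the entry at (x, y)."""
    M = np.zeros((n, n))
    M[x, y] = value
    return M

# A matching product concatenates paths: a_{12} a_{23} is a path 1 -> 3.
assert np.array_equal(path(1, 2) @ path(2, 3), path(1, 3))

# A non-matching product (y_1 != x_2) is identified with zero.
assert not (path(1, 2) @ path(3, 0)).any()
```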

Lie bracket

{$[a_{ij},b_{kl}] = a_{ij}b_{kl} - b_{kl}a_{ij}$}

{$= 0 - 0 = 0$} if {$j \neq k$} and {$l \neq i$}

{$= a_{ij}b_{jl}$} if {$j = k$} and {$l \neq i$}

{$= -b_{ki}a_{ij}$} if {$j \neq k$} and {$l = i$}

{$= a_{ij}b_{ji} - b_{ji}a_{ij} = 0$} if {$j = k$} and {$l = i$}

The Lie bracket thus combines two paths into a new path, if possible. A loop is not allowed.

The Jacobi identity falls out as a consequence of defining the Lie bracket as {$[a_{ij},b_{kl}] = a_{ij}b_{kl} - b_{kl}a_{ij}$}. Combinatorially, any term {$a_{ij}b_{jl}c_{ln}$} can be broken up at two points, {$j$} and {$l$}, and thus arises with positive sign from {$[a_{ij}, [b_{jl},c_{ln}]]$} and with negative sign from {$[c_{ln}, [a_{ij}, b_{jl}]]$}, so that the two occurrences cancel out in the Jacobi identity.
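This cancellation can be verified numerically. A sketch, with random {$4 \times 4$} matrices standing in for generic elements:

```python
import numpy as np

rng = np.random.default_rng(0)

def bracket(A, B):
    """The Lie bracket [A, B] = AB - BA."""
    return A @ B - B @ A

A, B, C = (rng.standard_normal((4, 4)) for _ in range(3))

# Jacobi identity: [A,[B,C]] + [B,[C,A]] + [C,[A,B]] = 0.
jacobi = bracket(A, bracket(B, C)) + bracket(B, bracket(C, A)) + bracket(C, bracket(A, B))
assert np.allclose(jacobi, 0)
```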

Trace

Given two matrices A and B, we have the following identity:

{$Tr(AB) = \sum_{i}\sum_{j}a_{ij}b_{ji} = \sum_{j}\sum_{i}b_{ji}a_{ij} = Tr(BA)$}

Consequently, {$Tr([A,B])=Tr(AB-BA)=0$}

This assures that the matrices A for which {$Tr(A)=0$} are closed under the Lie bracket.
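Both the trace identity and the resulting closure can be checked numerically; a sketch with random matrices:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# Tr(AB) = Tr(BA), hence Tr([A,B]) = 0 for any A, B.
assert np.isclose(np.trace(A @ B), np.trace(B @ A))
assert np.isclose(np.trace(A @ B - B @ A), 0)

# Project A and B onto traceless matrices; their bracket is again traceless.
A -= np.trace(A) / 4 * np.eye(4)
B -= np.trace(B) / 4 * np.eye(4)
assert np.isclose(np.trace(A), 0) and np.isclose(np.trace(B), 0)
assert np.isclose(np.trace(A @ B - B @ A), 0)
```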

Eigenvectors for {$A_n$}

Consider {$H$} in a Cartan subalgebra of a Lie algebra {$L$} of matrices, namely a diagonal matrix with entries {$h_{ii}$}. Let {$H \cdot e_{ij}$} denote the adjoint action {$[H,e_{ij}] = He_{ij} - e_{ij}H$}. Suppose that {$H \cdot e_{ij} = w \cdot e_{ij}$} for some eigenvalue {$w$}. Then {$w = h_{ii}-h_{jj}$} captures the end points of the path {$e_{ij}$} as follows:

{$H \cdot e_{ij} = h_{ii}e_{ij} - e_{ij}h_{jj} = (h_{ii} - h_{jj}) e_{ij} $}

If {$i<j$}, then {$e_{ij}$} is in the upper triangle and {$e_{ji}$} is in the lower triangle, and we have:

{$H \cdot e_{ji} = h_{jj}e_{ji} - e_{ji}h_{ii} = (h_{jj} - h_{ii}) e_{ji} $}

If {$i=j$}, then

{$H \cdot e_{ii} = h_{ii}e_{ii} - e_{ii}h_{ii} = 0 $}

Thus the roots {$h_{ii}-h_{jj}$} for all {$i \neq j$} are inherent to all Lie algebras.
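This eigenvalue computation can be checked concretely; a sketch with hypothetical diagonal entries {$h_{ii}$} and 0-based indices:

```python
import numpy as np

n = 4
h = np.array([3.0, 1.0, -2.0, -2.0])   # hypothetical diagonal entries h_ii
H = np.diag(h)

def e(i, j, n=n):
    """Elementary matrix e_ij: the path from i to j."""
    M = np.zeros((n, n))
    M[i, j] = 1.0
    return M

# [H, e_ij] = (h_ii - h_jj) e_ij: every e_ij is an eigenvector of ad H.
for i in range(n):
    for j in range(n):
        if i != j:
            assert np.allclose(H @ e(i, j) - e(i, j) @ H, (h[i] - h[j]) * e(i, j))
```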

More eigenvectors?

What additional eigenvectors can we get if we play with symmetry? We can name the indices so as to make explicit the symmetry in counting backwards and forwards. Define indices:

{$\check{n} < \dots < \check{3} < \check{2} < \check{1} < \hat{1} < \hat{2} < \hat{3} < \dots < \hat{n}$}

If the indices are used for the sides of a matrix, then they establish symmetries: left and right, and top and bottom.

{$ \begin{pmatrix} a_{\check{i}\check{j}} & a_{\check{i}\hat{j}}\\ a_{\hat{i}\check{j}} & a_{\hat{i}\hat{j}} \end{pmatrix}$}

These two symmetries are unified on the diagonal: {$a_{\check{i}\check{i}}$} and {$a_{\hat{i}\hat{i}}$}

This establishes a symmetry across the anti-diagonal, relating {$a_{\check{i}\check{j}}$} and {$a_{\hat{j}\hat{i}}$}, {$a_{\check{i}\hat{j}}$} and {$a_{\check{j}\hat{i}}$}, and {$a_{\hat{i}\check{j}}$} and {$a_{\hat{j}\check{i}}$}.

{$ \begin{pmatrix} {\begin{pmatrix} a_{\check{i}\check{j}} & \\ & \end{pmatrix}} & {\begin{pmatrix} a_{\check{i}\hat{j}} & \\ & a_{\check{j}\hat{i}} \end{pmatrix}} \\ {\begin{pmatrix} a_{\hat{i}\check{j}} & \\ & a_{\hat{j}\check{i}} \end{pmatrix}} & {\begin{pmatrix} & \\ & a_{\hat{j}\hat{i}} \end{pmatrix}} \end{pmatrix} $}

Given such a symmetry, what constraint on matrices M might stay invariant under the Lie bracket?

Define {$m_{\check{i}\check{j}} = \omega \cdot m_{\hat{j}\hat{i}}$}, {$m_{\check{i}\hat{j}} = \lambda \cdot m_{\check{j}\hat{i}}$}, and {$m_{\hat{i}\check{j}} = \tau \cdot m_{\hat{j}\check{i}}$}.

Then calculate {$[A,B]$}:

{$ \begin{pmatrix} a_{\check{i}\check{j}} & & a_{\check{i}\hat{j}} & \\ & & & a_{\check{j}\hat{i}} \\ a_{\hat{i}\check{j}} & & & \\ & a_{\hat{j}\check{i}} & & a_{\hat{j}\hat{i}} \end{pmatrix}$} {$ \begin{pmatrix} b_{\check{i}\check{j}} & & b_{\check{i}\hat{j}} & \\ & & & b_{\check{j}\hat{i}} \\ b_{\hat{i}\check{j}} & & & \\ & b_{\hat{j}\check{i}} & & b_{\hat{j}\hat{i}} \end{pmatrix}$} {$-$} {$ \begin{pmatrix} b_{\check{i}\check{j}} & & b_{\check{i}\hat{j}} & \\ & & & b_{\check{j}\hat{i}} \\ b_{\hat{i}\check{j}} & & & \\ & b_{\hat{j}\check{i}} & & b_{\hat{j}\hat{i}} \end{pmatrix}$} {$ \begin{pmatrix} a_{\check{i}\check{j}} & & a_{\check{i}\hat{j}} & \\ & & & a_{\check{j}\hat{i}} \\ a_{\hat{i}\check{j}} & & & \\ & a_{\hat{j}\check{i}} & & a_{\hat{j}\hat{i}} \end{pmatrix}$}

{$$[A,B]_{\check{i}\check{j}} = \sum_{\check{k}}(a_{\check{i}\check{k}}b_{\check{k}\check{j}}-b_{\check{i}\check{k}}a_{\check{k}\check{j}})+\sum_{\hat{k}}(a_{\check{i}\hat{k}}b_{\hat{k}\check{j}}-b_{\check{i}\hat{k}}a_{\hat{k}\check{j}})$$}

{$$= \sum_{\check{k}}({\omega}^2 \cdot a_{\hat{k}\hat{i}}b_{\hat{j}\hat{k}}- {\omega}^2 \cdot b_{\hat{k}\hat{i}}a_{\hat{j}\hat{k}})+\sum_{\hat{k}}(\lambda \tau \cdot a_{\check{k}\hat{i}}b_{\hat{j}\check{k}}-\lambda \tau \cdot b_{\check{k}\hat{i}}a_{\hat{j}\check{k}})$$}

{$$= \sum_{\check{k}}({\omega}^2 \cdot b_{\hat{j}\hat{k}}a_{\hat{k}\hat{i}} - {\omega}^2 \cdot a_{\hat{j}\hat{k}}b_{\hat{k}\hat{i}})+\sum_{\hat{k}}(\lambda \tau \cdot b_{\hat{j}\check{k}}a_{\check{k}\hat{i}}-\lambda \tau \cdot a_{\hat{j}\check{k}}b_{\check{k}\hat{i}})$$}

{$$= -{\omega}^2 \cdot \sum_{\check{k}}(a_{\hat{j}\hat{k}}b_{\hat{k}\hat{i}} - b_{\hat{j}\hat{k}}a_{\hat{k}\hat{i}}) - \lambda \tau \cdot \sum_{\hat{k}}(a_{\hat{j}\check{k}}b_{\check{k}\hat{i}} - b_{\hat{j}\check{k}}a_{\check{k}\hat{i}})$$}

{$$= - \lambda \tau \cdot \sum_{\hat{k}}(a_{\hat{j}\check{k}}b_{\check{k}\hat{i}} - b_{\hat{j}\check{k}}a_{\check{k}\hat{i}}) - {\omega}^2 \cdot \sum_{\check{k}}(a_{\hat{j}\hat{k}}b_{\hat{k}\hat{i}} - b_{\hat{j}\hat{k}}a_{\hat{k}\hat{i}}) $$}

{$$= - \lambda \tau \cdot \sum_{\check{k}}(a_{\hat{j}\check{k}}b_{\check{k}\hat{i}} - b_{\hat{j}\check{k}}a_{\check{k}\hat{i}}) - {\omega}^2 \cdot \sum_{\hat{k}}(a_{\hat{j}\hat{k}}b_{\hat{k}\hat{i}} - b_{\hat{j}\hat{k}}a_{\hat{k}\hat{i}}) $$}

{$$[A,B]_{\hat{j}\hat{i}} = \sum_{\check{k}}(a_{\hat{j}\check{k}}b_{\check{k}\hat{i}}-b_{\hat{j}\check{k}}a_{\check{k}\hat{i}})+\sum_{\hat{k}}(a_{\hat{j}\hat{k}}b_{\hat{k}\hat{i}}-b_{\hat{j}\hat{k}}a_{\hat{k}\hat{i}})$$}

Then {$[A,B]_{\check{i}\check{j}} = \omega \cdot [A,B]_{\hat{j}\hat{i}}$} implies that {$ - \lambda \tau = \omega $} and {$ - {\omega}^2 = \omega $}. The latter forces {$ \omega = -1 $}, and then the former gives {$ \lambda \tau = 1$}. Moreover, applying the constraint {$m_{\check{i}\hat{j}} = \lambda \cdot m_{\check{j}\hat{i}}$} with {$i$} and {$j$} interchanged gives {$\lambda^2 = 1$}. Thus {$ \lambda = \tau = 1$} or {$ \lambda = \tau = -1$}.
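In the case {$\omega = \lambda = \tau = -1$}, every entry of {$M$} is the negative of its partner across the anti-diagonal, that is, {$M = -JM^TJ$} where {$J$} is the exchange matrix (the identity flipped left-to-right). Closure under the bracket can then be checked numerically; a sketch with a hypothetical size {$8 = 2 \cdot 4$}:

```python
import numpy as np

n = 4
J = np.fliplr(np.eye(2 * n))   # exchange matrix: reflection across the anti-diagonal

def anti_transpose(M):
    """Reflect M across the anti-diagonal: entry (i,j) goes to (N-1-j, N-1-i)."""
    return J @ M.T @ J

def project(M):
    """Impose the constraint omega = lambda = tau = -1: M = -anti_transpose(M)."""
    return (M - anti_transpose(M)) / 2

rng = np.random.default_rng(2)
A = project(rng.standard_normal((2 * n, 2 * n)))
B = project(rng.standard_normal((2 * n, 2 * n)))

# The constraint survives the Lie bracket.
C = A @ B - B @ A
assert np.allclose(C, -anti_transpose(C))
```

This is the orthogonal Lie algebra written in a basis where the bilinear form is the exchange matrix {$J$}.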

{$$[A,B]_{\check{i}\hat{j}} = \sum_{\check{k}}(a_{\check{i}\check{k}}b_{\check{k}\hat{j}}-b_{\check{i}\check{k}}a_{\check{k}\hat{j}})+\sum_{\hat{k}}(a_{\check{i}\hat{k}}b_{\hat{k}\hat{j}}-b_{\check{i}\hat{k}}a_{\hat{k}\hat{j}})$$}

{$$= \sum_{\check{k}}(\omega \cdot a_{\hat{k}\hat{i}} \lambda \cdot b_{\check{j}\hat{k}}-\omega \cdot b_{\hat{k}\hat{i}} \lambda \cdot a_{\check{j}\hat{k}})+\sum_{\hat{k}}(\lambda \cdot a_{\check{k}\hat{i}} \omega \cdot b_{\check{j}\check{k}} - \lambda \cdot b_{\check{k}\hat{i}} \omega \cdot a_{\check{j}\check{k}})$$}

{$$= - \omega \lambda \cdot \sum_{\check{k}}(b_{\hat{k}\hat{i}} a_{\check{j}\hat{k}} - a_{\hat{k}\hat{i}} b_{\check{j}\hat{k}}) - \lambda \omega \cdot \sum_{\hat{k}}(b_{\check{k}\hat{i}}a_{\check{j}\check{k}} - a_{\check{k}\hat{i}}b_{\check{j}\check{k}})$$}

{$$= - \omega \lambda \cdot \sum_{\check{k}}(a_{\check{j}\hat{k}}b_{\hat{k}\hat{i}} - b_{\check{j}\hat{k}}a_{\hat{k}\hat{i}}) - \lambda \omega \cdot \sum_{\hat{k}}(a_{\check{j}\check{k}}b_{\check{k}\hat{i}} - b_{\check{j}\check{k}}a_{\check{k}\hat{i}})$$}

{$$= - \lambda \omega \cdot \sum_{\hat{k}}(a_{\check{j}\check{k}}b_{\check{k}\hat{i}} - b_{\check{j}\check{k}}a_{\check{k}\hat{i}}) - \omega \lambda \cdot \sum_{\check{k}}(a_{\check{j}\hat{k}}b_{\hat{k}\hat{i}} - b_{\check{j}\hat{k}}a_{\hat{k}\hat{i}})$$}

{$$= - \lambda \omega \cdot \sum_{\check{k}}(a_{\check{j}\check{k}}b_{\check{k}\hat{i}} - b_{\check{j}\check{k}}a_{\check{k}\hat{i}}) - \omega \lambda \cdot \sum_{\hat{k}}(a_{\check{j}\hat{k}}b_{\hat{k}\hat{i}} - b_{\check{j}\hat{k}}a_{\hat{k}\hat{i}})$$}

{$$[A,B]_{\check{j}\hat{i}} = \sum_{\check{k}}(a_{\check{j}\check{k}}b_{\check{k}\hat{i}}-b_{\check{j}\check{k}}a_{\check{k}\hat{i}})+\sum_{\hat{k}}(a_{\check{j}\hat{k}}b_{\hat{k}\hat{i}}-b_{\check{j}\hat{k}}a_{\hat{k}\hat{i}})$$}

Here {$[A,B]_{\check{i}\hat{j}} = \lambda \cdot [A,B]_{\check{j}\hat{i}}$} implies that {$ - \lambda \omega = \lambda $} and that {$ - \omega \lambda = \lambda $}. In either case, all that we conclude is that {$\omega = -1 $}. By symmetry, we draw the same conclusion from {$[A,B]_{\hat{i}\check{j}} = \tau \cdot [A,B]_{\hat{j}\check{i}}$}. So these cases are uninformative compared to {$[A,B]_{\check{i}\check{j}} = \omega \cdot [A,B]_{\hat{j}\hat{i}}$}.

Consequences

Our Lie algebra has roots {$h_{ii}-h_{jj}$} for all {$i \neq j$}. Note that the minus sign orders the components. We can interpret the ordering in two ways: from left to right, and from right to left. Thus let us explicitly label the right-to-left interpretation {$h_{\check{i}\check{i}}-h_{\check{j}\check{j}}$} and let us label the left-to-right interpretation {$h_{\hat{i}\hat{i}}-h_{\hat{j}\hat{j}}$}. Then we can also have mixed modes {$h_{\check{i}\check{i}}-h_{\hat{j}\hat{j}}$} and {$h_{\hat{i}\hat{i}}-h_{\check{j}\check{j}}$}.

We can relate the interpretations algebraically if we write the right-to-left interpretation as negative, so that {$h_{\check{i}\check{i}}-h_{\check{j}\check{j}}$} is {$-h_{ii}-(-h_{jj}) = -h_{ii}+h_{jj}$}, and the left-to-right interpretation as positive, so that {$h_{\hat{i}\hat{i}}-h_{\hat{j}\hat{j}}$} is {$h_{ii}-h_{jj}$}. Then the mixed mode {$h_{\check{i}\check{i}}-h_{\hat{j}\hat{j}}$} yields {$-h_{ii}-h_{jj}$} and the mixed mode {$h_{\hat{i}\hat{i}}-h_{\check{j}\check{j}}$} yields {$h_{ii}-(-h_{jj}) = h_{ii}+h_{jj}$}. In other words, on the matrix diagonal we have:

{$$\begin{pmatrix}-h_{jj} & & & \\ & -h_{ii} & & \\ & & h_{ii} & \\ & & & h_{jj}\end{pmatrix}$$}
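Under the constraint {$\omega = \lambda = \tau = -1$}, the diagonal indeed pairs off with opposite signs, as this sketch checks (with a hypothetical size 6):

```python
import numpy as np

N = 6
J = np.fliplr(np.eye(N))               # exchange matrix
rng = np.random.default_rng(3)
M = rng.standard_normal((N, N))
M = (M - J @ M.T @ J) / 2              # impose omega = lambda = tau = -1

# The diagonal reads (-h_j, -h_i, ..., h_i, h_j):
# reversing it negates it.
d = np.diag(M)
assert np.allclose(d, -d[::-1])
```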

This page was last changed on November 19, 2018, at 01:31 PM