Becoming anti de Sitter

In the last post we were discussing Killing vector fields of the group SL(2,R). We did it without giving any particular reason, except that the subject somehow came our way naturally. But now there is an opportunity to relate our theme to something that is fashionable in theoretical physics: the holographic principle and the AdS/CFT correspondence.

We were playing with AdS without knowing it. Here AdS stands for “anti-de Sitter” space. Let us therefore look into the content of one pedagogical paper dealing with the subject: “Anti de Sitter space, squashed and stretched” by Ingemar Bengtsson and Patrik Sandin. We will not be squashing and stretching – not yet. Our task is “to connect” to what other people are doing. Let us start reading Section 2 of the paper, “Geodetic congruence in anti-de Sitter space”. There we read:

For the 2+1 dimensional case the definition can be reformulated in an interesting way. Anti-de Sitter space can be regarded as the group manifold of SL(2,{\bf R}), that is as the set of matrices

(1)   \begin{equation*} A = \left[ \begin{array}{cc} V+X & Y+U \\ Y-U & V-X \end{array} \right] \ , \hspace{10mm} \mbox{det}A = U^2 + V^2 - X^2 - Y^2 = 1 \ .  \end{equation*}

It is clear that every SL(2,R) matrix A=\left[\begin{smallmatrix}\alpha&\beta\\ \gamma&\delta\end{smallmatrix}\right] can be uniquely written in the above form.
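
Indeed, comparing matrix entries gives explicitly

    \[ V=\frac{\alpha+\delta}{2},\quad X=\frac{\alpha-\delta}{2},\quad Y=\frac{\beta+\gamma}{2},\quad U=\frac{\beta-\gamma}{2},\]

and then \mbox{det}A=\alpha\delta-\beta\gamma=(V+X)(V-X)-(Y+U)(Y-U)=U^2+V^2-X^2-Y^2.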

But Section 2 starts with something else:

\noindent Anti-de Sitter space is defined as a quadric surface embedded in a flat space of signature (+ \dots +--). Thus 2+1 dimensional anti-de Sitter space is defined as the hypersurface

(2)   \begin{equation*} X^2 + Y^2 - U^2 - V^2 = - 1 \end{equation*}

\noindent embedded in a 4 dimensional flat space with the metric

(3)   \begin{equation*} ds^2 = dX^2 + dY^2 - dU^2 - dV^2 \ . \end{equation*}

\noindent The Killing vectors are denoted J_{XY} = X\partial_Y - Y\partial_X,
J_{XU} = X\partial_U + U\partial_X, and so on. The topology is now
{\bf R}^2 \times {\bf S}^1, and one may wish to go to the covering
space in order to remove the closed timelike curves. Our arguments
will mostly not depend on whether this final step is taken.

For the 2+1 dimensional case the definition can be reformulated in an interesting way. Anti-de Sitter space can be regarded as the group manifold of SL(2,{\bf R}), that is as the set of matrices

(4)   \begin{equation*} g = \left[ \begin{array}{cc} V+X & Y+U \\ Y-U & V-X \end{array} \right] \ , \hspace{10mm} \mbox{det}g = U^2 + V^2 - X^2 - Y^2 = 1 \ .  \end{equation*}

\noindent The group manifold is equipped with its natural metric, which is invariant under transformations g \rightarrow g_1gg_2^{-1}, g_1, g_2 \in SL(2, {\bf R}). The Killing vectors can now be organized into two orthonormal and mutually commuting sets,

(5)   \begin{eqnarray*} & J_1 = - J_{XU} - J_{YV} \hspace{15mm} & \tilde{J}_1 =  - J_{XU} + J_{YV} \\ & J_2 = - J_{XV} + J_{YU} \hspace{15mm} & \tilde{J}_2 = - J_{XV} - J_{YU} \\ & J_0 = - J_{XY} - J_{UV} \hspace{15mm} & \tilde{J}_0 = J_{XY} - J_{UV} \ . \end{eqnarray*}

\noindent They obey

(6)   \begin{equation*} ||J_1||^2 = ||J_2||^2 = - ||J_0||^2 = 1 \ , \hspace{3mm} ||\tilde{J}_1||^2 = ||\tilde{J}_2||^2 = - ||\tilde{J}_0||^2 = 1 \ . \end{equation*}
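
Before going further, let us check one instance of Eq. (6) directly, using the metric (3) and the constraint (2). Following the same pattern as above, take J_{YV}=Y\partial_V+V\partial_Y and J_{UV}=U\partial_V-V\partial_U (this is what the “and so on” presumably means). Then, for example,

    \[ J_1=-J_{XU}-J_{YV}=-U\partial_X-V\partial_Y-X\partial_U-Y\partial_V,\qquad ||J_1||^2=U^2+V^2-X^2-Y^2=1 \]

on the quadric (2), and similarly

    \[ J_0=Y\partial_X-X\partial_Y+V\partial_U-U\partial_V,\qquad ||J_0||^2=X^2+Y^2-U^2-V^2=-1. \]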

The story here is this: 2\times 2 real matrices form a four-dimensional real vector space. We can use \alpha,\beta,\gamma,\delta or, equivalently, X,Y,U,V as coordinates y^1,y^2,y^3,y^4 there. The condition of being of determinant one defines a three-dimensional hypersurface in \mathbf{R}^4. We can endow \mathbf{R}^4 with a scalar product determined by the matrix G defined by:

(7)   \begin{equation*}G=\begin{bmatrix}1&0&0&0\\0&1&0&0\\0&0&-1&0\\0&0&0&-1\end{bmatrix}.\end{equation*}

The scalar product is then defined as

(8)   \begin{equation*}(y,y')=y^TGy'=G_{ij}y^{i}y'^{j}=y^1y'^1+y^2y'^2-y^3y'^3-y^4y'^4.\end{equation*}

This scalar product is invariant with respect to the group O(2,2) of 4\times 4 real matrices A satisfying:

(9)   \begin{equation*}A^TGA=G.\end{equation*}

That is, if A\in O(2,2), then (Ay,Ay')=(y,y') for all y,y' in \mathbf{R}^4. (The subgroup SO(2,2)\subset O(2,2) consists of those A which, in addition, have determinant one.)
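
As a simple example, with the coordinate ordering y=(X,Y,U,V) used above, take the hyperbolic rotation mixing y^1 and y^3,

    \[ A(t)=\begin{bmatrix}\cosh t&0&\sinh t&0\\0&1&0&0\\ \sinh t&0&\cosh t&0\\0&0&0&1\end{bmatrix}. \]

A direct computation shows that A(t)^TGA(t)=G, so A(t)\in O(2,2) for every t; in fact A(t)=\exp(t\,\Xi_{(13)}), where \Xi_{(13)} is one of the generators listed in Eq. (12) below.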

What I will be writing now is “elementary” in the sense that “everybody in the business” knows it and, if asked, will often not be able to tell where and when he or she learned it. But this is a blog, and the subject is so pretty that it would be a pity if people “not in the business” were to miss it.

Equation (2) can then be written as (y,y)=-1. It determines a “generalized hyperboloid” in \mathbf{R}^4 that is invariant with respect to the action of O(2,2). Thus the situation is analogous to the one we have seen in The disk and the hyperbolic model. There we had the Poincaré disk realized as a two-dimensional hyperboloid in a three-dimensional space with signature (2,1); here we have SL(2,R) realized as a generalized hyperboloid in a four-dimensional space with signature (2,2). Before it was the group O(2,1) that acted on the hyperboloid; now it is the group O(2,2). Let us look at the vector fields of the generators of this group. By differentiating Eq. (9) at the group identity we find that each generator \Xi must satisfy the equation:

(10)   \begin{equation*}\Xi^TG+G\Xi=0.\end{equation*}
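
Indeed, take a curve A(t)=\exp(t\Xi) in O(2,2) with A(0)=I, so that \dot{A}(0)=\Xi, and differentiate Eq. (9) at t=0:

    \[ 0=\frac{d}{dt}\Big|_{t=0}\left(A(t)^TGA(t)\right)=\Xi^TG+G\Xi. \]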

This equation can also be written as

(11)   \begin{equation*}(G\Xi)^T+G\Xi=0.\end{equation*}

Thus G\Xi must be antisymmetric. In n dimensions the space of antisymmetric matrices is n(n-1)/2-dimensional. For us n=4, therefore the Lie algebra so(2,2) is 6-dimensional, like the Lie algebra so(4); as vector spaces the two are related simply by the matrix multiplication \Xi\mapsto G\Xi. We need a basis in so(2,2), so let us start with a basis in so(4). Let M_{(\mu\nu)} denote the elementary antisymmetric matrix that has 1 in row \mu, column \nu, and -1 in row \nu, column \mu, for \mu\neq \nu, and zeros everywhere else. In a formula:

    \[ (M_{(\mu\nu)})_{\alpha\beta}=\delta_{\alpha\mu}\delta_{\beta\nu}-\delta_{\alpha\nu}\delta_{\beta\mu},\]

where \delta_{\mu\nu} is the Kronecker delta symbol: \delta_{\mu\nu}=1 for \mu=\nu, and =0 for \mu\neq\nu.

As we have mentioned above, the matrices \Xi_{(\mu\nu)}=G^{-1}M_{(\mu\nu)} then form a basis of the Lie algebra so(2,2). We can list them as follows:

(12)   \begin{equation*}\Xi_{(12)}=\left[\begin{smallmatrix}0&1&0&0\\-1&0&0&0\\0&0&0&0\\0&0&0&0\end{smallmatrix}\right], \Xi_{(13)}=\left[\begin{smallmatrix}0&0&1&0\\0&0&0&0\\1&0&0&0\\0&0&0&0\end{smallmatrix}\right], \Xi_{(14)}=\left[\begin{smallmatrix}0&0&0&1\\0&0&0&0\\0&0&0&0\\1&0&0&0\end{smallmatrix}\right].\end{equation*}

(13)   \begin{equation*}\Xi_{(23)}=\left[\begin{smallmatrix}0&0&0&0\\0&0&1&0\\0&1&0&0\\0&0&0&0\end{smallmatrix}\right], \Xi_{(24)}=\left[\begin{smallmatrix}0&0&0&0\\0&0&0&1\\0&0&0&0\\0&1&0&0\end{smallmatrix}\right], \Xi_{(34)}=\left[\begin{smallmatrix}0&0&0&0\\0&0&0&0\\0&0&0&-1\\0&0&1&0\end{smallmatrix}\right].\end{equation*}
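
As a quick check that these matrices really belong to so(2,2), multiply one of them by G; for instance,

    \[ G\,\Xi_{(13)}=\left[\begin{smallmatrix}1&0&0&0\\0&1&0&0\\0&0&-1&0\\0&0&0&-1\end{smallmatrix}\right]\left[\begin{smallmatrix}0&0&1&0\\0&0&0&0\\1&0&0&0\\0&0&0&0\end{smallmatrix}\right]=\left[\begin{smallmatrix}0&0&1&0\\0&0&0&0\\-1&0&0&0\\0&0&0&0\end{smallmatrix}\right]=M_{(13)}, \]

which is antisymmetric, so Eq. (11) is indeed satisfied.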

On the path to AdS

In the next post we will relate these generators to J_i,\tilde{J}_i from the anti-de Sitter paper by Bengtsson and Sandin, and to our Killing vector fields \xi_{iL},\xi_{iR} from the last note.