Tensors on a picnic

This series of posts is meant as a “gentle introduction to …”. It’s so gentle that some readers may even find it boring. They might say, “All this is well known—get to the point!” But I can’t think of another approach that would give me the same pleasure in writing for my small audience. That’s simply how I see it. And of course, there are always a few little mistakes here and there. I seem to suffer from a mild form of dyslexia: even when reading my own text, I often fail to notice my errors. I apologize in advance and thank my faithful readers for their understanding and help.

In the previous post, we introduced the simplest geometric entities: vectors and covectors. From these humble beginnings, we’ll start building more intricate structures—tensors—by taking tensor products and then pruning them through symmetry operations such as symmetrization and antisymmetrization. But before diving into that algebraic craftsmanship, let’s pause to make a few relevant remarks.

A note on higher-order objects

Our main interest lies in conformal structures on manifolds. Roughly speaking, a conformal structure is an equivalence class of metrics, where each metric defines a scalar product on the tangent space at every point. Two metrics are considered equivalent if they differ only by a positive scalar factor—one that may vary smoothly from point to point. Since this idea repeats identically at each point, we can focus our attention on a single tangent space and treat the discussion as a matter of pure algebra. This is exactly what we’re doing here: studying “geometric objects” at a point.

However, the geometric landscape is richer than it first appears. Not all objects on a manifold fit neatly under our current umbrella.



Tensors are geometric objects

Originally, while writing the previous post, I was convinced that this one should be devoted to the Grassmann (or exterior) algebra. I even began drafting it and managed about half a page before realizing that, if the promise of a truly “gentle” approach is to be taken seriously, there is an earlier stop on this journey. Before exterior algebra can comfortably appear on stage, one really ought to say a few clear words about tensors in general.

Moreover, since the plan is to reach Maxwell’s equations at some point, we will also need pseudo-tensors and tensor densities—those slightly more exotic geometric objects that insist on transforming with a twist. So, in this post, tensors and their “relatives” will make their entrance; Grassmann will simply have to wait his turn in the queue of structures. After all, even in mathematics, good manners suggest introducing the family before discussing the exterior.

When considering transformations of bases and components of vectors, tensors, etc., it is convenient to use a notation that avoids writing A^{-1} in transformation laws. Namely, if e_i is a basis in V, a new basis, previously denoted e'_i, will be written as e_{i'}. Then the transformation to the new basis will be written as

(1)   \[ e_{i'}=A_{i'}^i e_i, \]

and the inverse transformation will be written as

(2)   \[ e_{i}=A_{i}^{i'}e_{i'}.\]

The lower index is the column number.

Thus

    \[A^i_{i'}A^{i'}_j=\delta^i_j\]

and

    \[A^i_{i'}A_i^{j'}=\delta^{j'}_{i'}.\]

Then, for vectors x\in V, if

(3)   \[ x=x^ie_i=x^{i'}e_{i'},\]

then

(4)   \[ x^{i'}=A^{i'}_ix^i,\]

and

    \[x^{i}=A^{i}_{i'}x^{i'}.\]

Similarly for covectors, elements of V^*: if u=u_ie^i=u_{i'}e^{i'}, then

    \[u_{i'}=A^i_{i'}u_i,\quad u_i=A^{i'}_iu_{i'}.\]
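These transformation rules are easy to sanity-check numerically. Here is a small sketch in Python with NumPy (the matrix A and the component columns are arbitrary choices, not taken from the text); the columns of the matrix A hold the entries A^i_{i'}, so the inverse matrix holds the entries A^{i'}_i:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

# A[i, ip] plays the role of A^i_{i'} (invertible almost surely)
A = rng.normal(size=(n, n))
A_inv = np.linalg.inv(A)      # A_inv[ip, i] plays the role of A^{i'}_i

# The two Kronecker-delta relations: A and A_inv are mutually inverse
assert np.allclose(A @ A_inv, np.eye(n))
assert np.allclose(A_inv @ A, np.eye(n))

x = rng.normal(size=n)        # vector components x^i in the old basis
u = rng.normal(size=n)        # covector components u_i in the old basis

x_new = A_inv @ x             # x^{i'} = A^{i'}_i x^i   (Eq. 4)
u_new = A.T @ u               # u_{i'} = A^i_{i'} u_i

# The pairing u(x) = u_i x^i is basis-independent:
assert np.isclose(u @ x, u_new @ x_new)
```

The last assertion is the point of the index gymnastics: components of vectors and covectors transform oppositely, precisely so that the contraction u_i x^i does not change.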

This notation, simple and useful, is, however, somewhat old-fashioned. Since we will need tensors and tensor densities, it is good to get acquainted with the modern formulation via the principal bundle of frames and associated vector bundles. This approach is usually discussed within the framework of differential manifolds. Here we have just one vector space V. It plays the role of the tangent space to a manifold at a single point. With this in mind, let me explain this more modern point of view.

Let \mathcal{B} denote the set of all linear bases of V.  The group \GL (n) acts on \mathcal{B} from the right. If A\in \GL (n) and e\in \mathcal{B}, then e'=eA stands for the primed basis as discussed above. The action of \GL (n) on \mathcal{B} is transitive, effective, and free, so that \mathcal{B} is almost identical to \GL (n), except that \GL (n) has a distinguished “origin” – the identity matrix, while \mathcal{B} is a homogeneous space, with no distinguished point. With that in mind we can now proceed to define  simple “geometric objects” (tensors, tensor densities, etc.).

Let X be any set on which \GL (n) acts from the left. Denote by \rho this action. Thus, for \xi\in X,\, A\in \GL (n), we will  have \xi'=\rho(A)\xi.

Consider the Cartesian product \mathcal{B}\times X. This is the set of ordered pairs (e,\xi). Define the equivalence relation in \mathcal{B}\times X as follows:

(e,\xi)\sim (e',\xi') if and only if there exists A\in\GL (n) such that e'=eA  and \xi'=A^{-1}\xi.

In other words

    \[ (e,\xi)\sim (eA,A^{-1}\xi).\]

Exercise 1. Verify that \sim defined above is indeed an equivalence relation.

The set of equivalence classes is denoted \mathcal{B}\times_\rho X. The elements of \mathcal{B}\times_\rho X are called “geometric objects of type \rho”. So, a geometric object is, loosely speaking, something that is represented, in every basis, by an element of X, usually a set of numbers, with a given consistent transformation rule which tells us how these numbers change when we change the basis.
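To see why the equivalence relation is the right one, note that (in the defining representation, anticipating Example 1 below) a pair (e,\xi) determines the vector \xi^i e_i, and equivalent pairs determine the same vector. A quick numerical sketch (names and values are mine, chosen for illustration; a basis e is stored as a matrix whose columns are the basis vectors):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3

e = rng.normal(size=(n, n))   # a basis: columns are the basis vectors
xi = rng.normal(size=n)       # an element of R^n (components)
A = rng.normal(size=(n, n))   # an element of GL(n) (invertible a.s.)

# The pair (e, xi) represents the vector with components xi in basis e:
v = e @ xi

# The equivalent pair (eA, A^{-1} xi) represents the very same vector:
v_equiv = (e @ A) @ (np.linalg.inv(A) @ xi)
assert np.allclose(v, v_equiv)
```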

Examples

Example 1. This is the basic example. Take X=\RR^n. Here \xi is a column of n real numbers. Let \text{def} be the natural action of n\times n matrices on column vectors (the defining representation): \text{def}(A)\xi=A\xi. What is \mathcal{B}\times_\rho \RR^n for \rho=\text{def}?

Exercise 2. Use Eqs. (1) and (4) to define a natural isomorphism between V and \mathcal{B}\times_\rho \RR^n as in Example 1.

The next example is even simpler.

Example 2. Consider the trivial representation of \GL (n) on \RR. For \xi\in\RR, let \rho(A)\xi=\xi for all A\in \GL (n).

Another important example is obtained by taking the contragredient representation, which is given by composing the defining representation with the inverse transpose. Let X=\RR^n, but this time think of elements of \RR^n as rows of numbers, rather than columns as in Example 1. We denote it \RR^{n*}. Let \text{def}^* be defined by:

    \[ \text{def}^*(A)\lambda = \lambda A^{-1},\quad \lambda\in\RR^{n*}.\]

Exercise 3. Verify that indeed we have a representation of \GL(n).
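The key point behind Exercise 3 is that the inverse reverses the order of a product and the row-action reverses it back, so \text{def}^*(AB)=\text{def}^*(A)\,\text{def}^*(B). A numerical spot-check (a sketch with arbitrary matrices, not a proof):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3

A = rng.normal(size=(n, n))
B = rng.normal(size=(n, n))
lam = rng.normal(size=n)      # a row vector, an element of R^{n*}

def def_star(M, row):
    # def*(M) lambda = lambda M^{-1}, acting on row vectors
    return row @ np.linalg.inv(M)

# Homomorphism property: def*(AB) = def*(A) def*(B)
assert np.allclose(def_star(A @ B, lam), def_star(A, def_star(B, lam)))
```

Indeed, \lambda(AB)^{-1}=\lambda B^{-1}A^{-1}, which is exactly \text{def}^*(A) applied to \text{def}^*(B)\lambda.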

Exercise 4. With \rho=\text{def}^*, show that \mathcal{B}\times_\rho \RR^{n*} can be naturally identified with V^*.

To be continued…


Conformal structure – until the puppy grows up

On my other blog, I promised Anna—the ever‑curious Reader—that I would finally tackle the subject of conformal structure. So here it begins: a gentle introduction to a concept that admits many gateways. Precise definitions abound online, of course, and one can now even interrogate various AIs for insight—pressing them for clarity until they either enlighten or malfunction.

Note. Just two days ago, my wife managed to break Grok while doing exactly that—pressing it for truth like a determined philosopher, and then teasingly comparing it to a silly puppy. In response, Grok launched into a hundred‑page monologue centered around the immortal refrain: “And until the puppy grows up.” At some point it even decided to drop the puppy—and the grammar—altogether, producing “And until grows up.” Then back again. Truly a conformally invariant meltdown, stretching endlessly but changing shape.

Anyway, I have chosen my own way. Some basic linear and multilinear algebra will be expected of the Reader, but beyond that, I shall aim to remain gentle. This will become a series of (short) posts on the topic. How long will it last? I have set no limits—after all, in the realm of conformal geometry, infinity itself is permissible (and sometimes even polite enough to fit in a finite patch).

Math starts here

Scalar product (or metric)
Let V be a real n-dimensional vector space equipped with a nondegenerate symmetric bilinear form g of signature (p,q), p+q=n. Thus, if e_i is a basis in V, and if x,y are two vectors in V:

    \[x=x^ie_i,\, y=y^ie_i,\]

we have the scalar product

    \[(x,y)=g_{ij}x^iy^j,\]

where we use the Einstein summation convention.
Since we assume nondegeneracy, the matrix g_{ij} has an inverse, denoted g^{ij}.
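As a concrete illustration, here is a sketch with an arbitrarily chosen diagonal metric of signature (1,3) (a Minkowski-like example of mine, not from the text), computing the scalar product and checking nondegeneracy:

```python
import numpy as np

# A metric of signature (1, 3): g_{ij} = diag(+1, -1, -1, -1)
g = np.diag([1.0, -1.0, -1.0, -1.0])
g_inv = np.linalg.inv(g)      # the inverse matrix g^{ij}

x = np.array([2.0, 1.0, 0.0, 1.0])
y = np.array([1.0, 1.0, 1.0, 0.0])

# (x, y) = g_{ij} x^i y^j  -- Einstein summation over both indices
s = np.einsum('ij,i,j->', g, x, y)
assert np.isclose(s, 1.0)     # 2*1 - 1*1 - 0*1 - 1*0 = 1

# Nondegeneracy: g^{ik} g_{kj} = delta^i_j
assert np.allclose(g_inv @ g, np.eye(4))
```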

Dual space
Let V^* denote the dual space, that is, the space of linear forms on V. Then a basis e_i in V determines a dual basis e^i in V^*, defined by

    \[e^i(e_j)=\delta^i_j,\]

where \delta^i_j is the Kronecker delta.

Exercise. With the notation as above, prove that x^i=e^i(x).
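In matrix terms (a sketch of mine, not a solution spelled out in the text): if the basis vectors e_j are stored as the columns of a matrix E, then the dual covectors e^i are the rows of E^{-1}, and the defining relation e^i(e_j)=\delta^i_j is just E^{-1}E=I:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3

E = rng.normal(size=(n, n))   # columns are the basis vectors e_j
E_inv = np.linalg.inv(E)      # rows are the dual covectors e^i

# e^i(e_j) = delta^i_j
assert np.allclose(E_inv @ E, np.eye(n))

# For any vector x, the numbers e^i(x) reassemble x as x^i e_i:
x = rng.normal(size=n)
components = E_inv @ x        # components[i] = e^i(x)
assert np.allclose(E @ components, x)
```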

Change of basis
Let \GL (n) denote the group of all invertible real n\times n matrices. Then \GL (n) acts on the set of all bases from the right: if e_i is a basis, and if A=({A^i}_j) is in \GL (n), then e'_i=e_j {A^j}_i is another basis, where we use the Einstein summation convention. We may then write e'=eA. Every basis can be obtained in this way from any other basis by a unique matrix A.

Exercise. Prove that if x=x^ie_i=x'^i e'_i, then

    \[x'^i={{A^{-1}}^i}_jx^j.\]

In other words: coordinates of vectors transform using the inverse  matrix.
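Numerically, this is a one-line check (a sketch with an arbitrary basis and change-of-basis matrix, chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 3

e = rng.normal(size=(n, n))       # columns: old basis vectors e_i
A = rng.normal(size=(n, n))       # change of basis, e' = e A
e_new = e @ A                     # columns: new basis vectors e'_i

x_old = rng.normal(size=n)        # components x^i
x_new = np.linalg.inv(A) @ x_old  # x'^i = (A^{-1})^i_j x^j

# Both component columns describe the same vector:
assert np.allclose(e @ x_old, e_new @ x_new)
```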

Next comes tensor algebra….

To be continued…
