See: Math notebook, Tensor, Triviality, Duality

Understand tensor combinatorics as a generalization of matrix combinatorics.


  • Given my combinatorial interpretation of tensors, try to understand all of the basic facts about tensors in terms of it.
  • Interpret the article Tensor Algebra: A Combinatorial Approach to the Projective Geometry of Figures.
  • Relate it to symmetric functions of the eigenvalues of a matrix. What results does it yield about tensor combinatorics? (See the sketch after this list.)
  • Study the representation theory of symmetric functions and the general linear group; try to relate Schur functions of the eigenvalues to tensor combinatorics.
  • Think of linearity as the size of apertures (or sides of the array boxes) entering and leaving the tensor. And consider the relation between doubling a scalar and doubling the number of components (dimensions) in a vector space.
  • Note that the vectors/covectors are components and so the components of a vector are subcomponents. Compare this to a power set as a set of subsets. Think of vectors as lists. In what sense is a tensor the analogue of a power set?
  • A tensor breaks up dimensions into its inputs (covectors) and its outputs (vectors). In what sense is this like factoring a number into two factors? What does it say if all of the factors are the same size, for example, primes?
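As a first handle on relating eigenvalues to matrix combinatorics (third bullet above), here is a minimal numerical sketch, my own illustration rather than anything from the article: the power sum {$p_k$} of the eigenvalues of {$A$} equals {$\textrm{tr}(A^k)$}, which counts the weighted closed walks of length {$k$} when {$A$} is read as an adjacency matrix. The triangle graph is just an illustrative choice.

    import numpy as np

    # Adjacency matrix of the triangle graph (an illustrative choice).
    A = np.array([[0., 1., 1.],
                  [1., 0., 1.],
                  [1., 1., 0.]])

    eigenvalues = np.linalg.eigvals(A)

    for k in range(1, 5):
        power_sum = np.sum(eigenvalues ** k).real               # p_k of the eigenvalues
        closed_walks = np.trace(np.linalg.matrix_power(A, k))   # closed walks of length k
        print(k, round(power_sum, 6), closed_walks)             # both columns: 0, 6, 6, 18

The power sums are the simplest symmetric functions of the eigenvalues; how the Schur functions of the eigenvalues show up combinatorially is exactly the question in the bullet above.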

A tensor {$T$} is defined as a multilinear map:

{$T:V^*\times \dots \times V^* \times V \times \dots \times V \rightarrow F$}

where the domain is the product of {$p$} copies of {$V^*$} and {$q$} copies of {$V$}; such a tensor is said to be of type {$(p,q)$}.
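To make the multilinear-map definition concrete, here is a minimal sketch, my own illustration with {$p=1$}, {$q=2$}, a 2-dimensional {$V$}, and made-up components: evaluating {$T$} on one covector and two vectors means summing products of components over all index combinations, and the final check confirms linearity in one of the vector slots.

    import numpy as np

    n = 2
    rng = np.random.default_rng(0)
    T = rng.integers(-2, 3, size=(n, n, n)).astype(float)   # components T[i,j,k] of a type-(1,2) tensor
    a = rng.integers(-2, 3, size=n).astype(float)            # a covector in V*
    u = rng.integers(-2, 3, size=n).astype(float)            # a vector in V
    v = rng.integers(-2, 3, size=n).astype(float)            # another vector in V
    w = rng.integers(-2, 3, size=n).astype(float)            # used to test linearity

    # T(a, u, v) = sum over i, j, k of T[i,j,k] * a[i] * u[j] * v[k]
    scalar = np.einsum('ijk,i,j,k->', T, a, u, v)

    # Multilinearity in the first vector slot: T(a, 2u + w, v) = 2 T(a, u, v) + T(a, w, v)
    lhs = np.einsum('ijk,i,j,k->', T, a, 2 * u + w, v)
    rhs = 2 * scalar + np.einsum('ijk,i,j,k->', T, a, w, v)
    print(scalar, lhs, rhs)
    assert np.isclose(lhs, rhs)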

A covector {$T$} is a linear functional {$T:V \rightarrow F$}. Here {$T$} acts on a vector much like taking an inner product with a fixed list of coefficients.

A vector {$T$} can be thought of as an element of {$V^{**}$}, namely {$T:V^* \rightarrow F$}. Here {$T$} acts by substituting itself into the covector.
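Here is a minimal sketch, my own, of these two one-slot cases: a covector acts on a vector like taking an inner product with a fixed list of coefficients, and a vector, regarded as an element of {$V^{**}$}, acts on a covector by being substituted into it. The numbers are arbitrary.

    import numpy as np

    c = np.array([3., -1.])        # components of a covector c in V*
    v = np.array([2., 5.])         # components of a vector v in V

    # The covector acting on the vector: c(v), an inner-product-like pairing.
    covector_on_vector = np.dot(c, v)                 # 3*2 + (-1)*5 = 1

    # The vector regarded as an element of V**: substitute v into any covector.
    def v_as_double_dual(covector):
        return np.dot(covector, v)

    print(covector_on_vector, v_as_double_dual(c))    # both print 1.0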

A linear transformation {$T$} can be thought of as bringing together a covector and a vector, taking us from the covector to the vector. It consists of the paths that take us from the components of the covector to the components of the vector.

{$\begin{pmatrix} c_1 & c_2 \end{pmatrix} \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \end{pmatrix} = c_1a_{11}v_1 + c_1a_{12}v_2 + c_2a_{21}v_1 + c_2a_{22}v_2$}

Combinatorially, a linear transformation is the paths from the components of a covector to the components of a vector.
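A minimal sketch, with my own numbers, of this "paths" reading of the identity above: each term {$c_ia_{ij}v_j$} is a path from component {$i$} of the covector, through entry {$(i,j)$} of the matrix, to component {$j$} of the vector, and summing over all paths reproduces the matrix product.

    import numpy as np
    from itertools import product

    c = np.array([1., 2.])                 # covector components c_1, c_2
    A = np.array([[5., 6.],
                  [7., 8.]])               # matrix entries a_ij
    v = np.array([3., 4.])                 # vector components v_1, v_2

    # Sum over all paths (i, j): covector component i -> entry (i, j) -> vector component j.
    path_sum = sum(c[i] * A[i, j] * v[j] for i, j in product(range(2), repeat=2))

    matrix_product = c @ A @ v             # the same scalar as a matrix product
    print(path_sum, matrix_product)
    assert np.isclose(path_sum, matrix_product)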

More generally, a tensor is the paths from the (multidimensional) components of a product of covectors to the (multidimensional) components of a product of vectors.

Consequently, a tensor is simply a matrix whose two indices are further organized so that they each reference a multidimensional array rather than a row or a column. Note that linearity needs to hold in each dimension.

Combinatorially, what does it mean for linearity to hold in each dimension?

Combinatorially, what does linearity mean for a matrix as a linear transformation?

Linearity takes care of itself if we think of the tensor as a matrix which takes us from a multidimensional array, given by the covectors, to a different multidimensional array, given by the vectors. This 2-dimensional matrix consists of components indexed by the components of the input array and the components of the output array.
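A minimal sketch, my own, of this flattening: a (2,2) tensor over a 2-dimensional {$V$}, stored as a {$2\times 2\times 2\times 2$} array, carries the same data as a {$4\times 4$} matrix from flattened input arrays (the covector indices) to flattened output arrays (the vector indices). The random components are just for illustration.

    import numpy as np

    n = 2
    rng = np.random.default_rng(1)
    T = rng.integers(-2, 3, size=(n, n, n, n)).astype(float)   # T[out1, out2, in1, in2]
    M_in = rng.integers(-2, 3, size=(n, n)).astype(float)       # an input 2-dimensional array

    # Acting with the tensor directly: contract the two input slots against M_in.
    out_array = np.einsum('ijkl,kl->ij', T, M_in)

    # The same action as an ordinary 4x4 matrix on a flattened length-4 input.
    T_flat = T.reshape(n * n, n * n)
    out_flat = T_flat @ M_in.reshape(n * n)

    assert np.allclose(out_array.reshape(n * n), out_flat)
    print(out_array)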

What does such a matrix mean in category theory?

In category theory, it gives all of the morphisms that link each object in the input array with each object in the output array.

In the case of a (2,2) tensor, it takes 2-dimensional arrays as inputs and yields 2-dimensional arrays as outputs. Thus it is a map from the entries of matrices to the entries of matrices. We can thus study walks, cycles, Lyndon words, etc. on the entries of a matrix or other array.
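One minimal way, my own sketch, to start studying such walks: flatten the (2,2) tensor into a {$4\times 4$} matrix acting on the four entry positions {$(i,j)$} of a {$2\times 2$} array, treat it as a weighted adjacency matrix on those positions, and count weighted closed walks of length {$k$} by {$\textrm{tr}(T_{\textrm{flat}}^k)$}.

    import numpy as np

    n = 2
    rng = np.random.default_rng(2)
    T = rng.integers(0, 2, size=(n, n, n, n)).astype(float)   # 0/1 components: which entry feeds which
    T_flat = T.reshape(n * n, n * n)                           # rows and columns indexed by entry positions (i, j)

    for k in range(1, 4):
        closed_walks = np.trace(np.linalg.matrix_power(T_flat, k))
        print(k, closed_walks)    # weighted closed walks of length k among the four entry positions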


Literature
