MORE RESULTS FROM MULTIPRODUCT IN MULTIVECTORS

British mathematician William Kingdon Clifford (1845-1879) intuited from Hamilton's research that EXPANSION OF PRIMARY UNITS IS BINARY: 1 (REAL), 2 (COMPLEX), 4 (QUATERNION), 8 (OCTONION), etc. Clifford went on to develop the GEOMETRIC ALGEBRA (CLIFFORD ALGEBRA) that encompasses these systems.


All of Linear Algebra can be DERIVED by ADDITION and MULTIPRODUCT, and easily extended to MULTILINEAR ALGEBRA. But this can be improved, because, in MULTIVECTOR THEORY (contrary to the Gibbs-Heaviside VECTOR ALGEBRA), EVERY NON-NULL VECTOR, v, HAS AN INVERSE,

v⁻¹ = v/(v·v), since: v⊗v⁻¹ = v·(v/(v·v)) + v∧(v/(v·v)) = 1 + 0 = 1, befitting an INVERSE. [6]

When the INVERSE COMMUTES, it BEHAVES LIKE DIVISION. This means that, in MULTIVECTOR THEORY, a VECTOR SPACE can be declared over a FIELD, rather than a RING, as in the standard formulation.
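
A quick NUMERIC CHECK of this inverse (a sketch in Python/NumPy; the vector v is my arbitrary choice, and the 3-D cross product stands in for the wedge, vanishing exactly when the factors are parallel):

    import numpy as np

    v = np.array([2.0, -1.0, 3.0])
    v_inv = v / np.dot(v, v)                     # v⁻¹ = v/(v·v)

    # v⊗v⁻¹ = v·v⁻¹ + v∧v⁻¹ should give 1 + 0.
    print(np.isclose(np.dot(v, v_inv), 1.0))     # inner part = 1
    print(np.allclose(np.cross(v, v_inv), 0.0))  # outer part = 0 (parallel vectors)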

ASSIGNMENT: Find and evocatively label an INDICATOR of COMMUTATIVITY OF INVERSE, and DEFINE THE TYPE OF VECTOR SPACE OBTAINED OVER SUCH A FIELD.

As we see below, THE INVERSE MULTIVECTOR induces a powerful "conjugacy" ALGORITHM for TRANSFORMATION, as in MATRIX THEORY: C⁻¹AC = B. We've noted that a vector, as a segment of a line, represents that line and gives a direction on it. And a bivector, as a section of a plane, represents that plane and gives it a rotational direction. Let a arise (say, obliquely) out of the B-plane, so we can study its projection onto the plane and its projection onto the normal to that plane: the former is the PROJECTION of a with respect to the B-plane, denoted a∥; the latter is the REJECTION of a with respect to the B-plane, denoted a⊥. Clearly, a = a∥ + a⊥, where a∥ = (a·B)⊗B⁻¹, a⊥ = (a∧B)⊗B⁻¹, and B⁻¹ = −B/|B|².
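
A NUMERIC SKETCH of projection and rejection (my own construction, in NumPy): in 3-D the B-plane can be represented by a unit normal n (the dual of B), and the two components then read a⊥ = (a·n)n, a∥ = a − a⊥:

    import numpy as np

    n = np.array([0.0, 0.0, 1.0])         # unit normal of the B-plane (here the xy-plane)
    a = np.array([1.0, 2.0, 3.0])         # a vector rising obliquely out of that plane

    a_rej  = np.dot(a, n) * n             # REJECTION: the component along the normal
    a_proj = a - a_rej                    # PROJECTION: the component lying in the B-plane

    print(np.allclose(a_proj + a_rej, a))         # a = a∥ + a⊥
    print(np.isclose(np.dot(a_proj, n), 0.0))     # a∥ lies in the plane
    print(np.allclose(np.cross(a_rej, n), 0.0))   # a⊥ is parallel to the normal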

Given directions a, b, with θ the angle between them, cos θ is the PROJECTION of one direction on the other, and sin θ is its REJECTION.

For UNIT VECTORS, a, b, we then have: a·b = cos θ; a∧b = i sin θ, where i is the UNIT-DIRECTOR of the a∧b-plane. [7]

Then, a⊗b = a·b + a∧b = cos θ + i sin θ. [8]

For u, v as unit (not necessarily basis) vectors, with u·v = cos θ and u∧v = i sin θ, we find u⊗v = u·v + u∧v = cos θ + i sin θ = exp(iθ); and v⊗u = cos θ − i sin θ = exp(−iθ).
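
A NUMERIC CHECK in the plane (a sketch; the two directions are arbitrary choices of mine): representing the unit bivector i by the imaginary unit, u⊗v should equal exp(iθ):

    import numpy as np

    alpha, beta = 0.3, 1.1                        # directions of u and v; θ = beta - alpha
    u = np.array([np.cos(alpha), np.sin(alpha)])
    v = np.array([np.cos(beta), np.sin(beta)])
    theta = beta - alpha

    inner = np.dot(u, v)                          # u·v
    outer = u[0] * v[1] - u[1] * v[0]             # coefficient of i in u∧v
    print(np.isclose(inner, np.cos(theta)))                        # u·v = cos θ
    print(np.isclose(outer, np.sin(theta)))                        # u∧v = i sin θ
    print(np.isclose(complex(inner, outer), np.exp(1j * theta)))   # u⊗v = exp(iθ)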

We now allow u⊗v and v⊗u to act as operators on a general vector, x:


           v⊗u x u⊗v = exp(−iθ) x exp(iθ) = x exp(i2θ).
Thus, the vector x rotates through an angle 2θ; but each operating multivector carries only the angle θ, which is the behavior of a spinor ("square root of a vector").
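
Here is a small NUMERIC MODEL of this conjugation (a sketch of my own: multivectors of Cl(2,0) stored as arrays [scalar, e1, e2, e12], with gp the geometric product):

    import numpy as np

    def gp(A, B):
        """Geometric product in Cl(2,0); components are [scalar, e1, e2, e12]."""
        a0, a1, a2, a3 = A
        b0, b1, b2, b3 = B
        return np.array([
            a0*b0 + a1*b1 + a2*b2 - a3*b3,
            a0*b1 + a1*b0 - a2*b3 + a3*b2,
            a0*b2 + a2*b0 + a1*b3 - a3*b1,
            a0*b3 + a3*b0 + a1*b2 - a2*b1,
        ])

    def vec(x1, x2):
        return np.array([0.0, x1, x2, 0.0])

    theta = 0.5
    u = vec(1.0, 0.0)
    v = vec(np.cos(theta), np.sin(theta))          # angle from u to v is θ
    x = vec(2.0, 1.0)                              # a general vector in the plane

    sandwich = gp(gp(gp(gp(v, u), x), u), v)       # v⊗u x u⊗v

    # Compare with x rotated through 2θ.
    c, s = np.cos(2*theta), np.sin(2*theta)
    rotated = vec(c*2.0 - s*1.0, s*2.0 + c*1.0)
    print(np.allclose(sandwich, rotated))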

Consider x1 + ix2.[9] We can use this to show that [5] is the simplest form of a SPINOR.

In [9], let x1 = cos θ and x2 = sin θ, so we have: e^(iθ) = cos θ + i sin θ = x1 + x2 i. [10]

Please recall that, in [3], [4], [5], we see A VECTOR USED AS AN OPERATOR ON A BIVECTOR OR SUM OF SCALAR AND BIVECTOR. In GENERAL, A MULTIVECTOR CAN ACT AS A LEFT- OR RIGHT-HAND OPERATOR UPON A MULTIVECTOR. Let's apply this POWER to [10].

We call up "the conjugation format", used with SUBGROUPS FIND NORMAL SUBGROUPS ("prime subgroups"), and used with MATRICES to FIND EIGENVALUES of MATRICES. Abstractly, with O for a general OPERATOR; with O, as its CONJUGATE; with P as a GENERAL OPERAND, we have the CONJUGATION FORMAT: OPO. [11]

Given a unit vector, u, and a general vector, x, we decompose x into a component, x∥, collinear with u, plus a component, x⊥, orthogonal to u, resulting in the decomposition: x = x∥ + x⊥. [12]

Let's CONSTRUCT this. We form x∥⊗u = x∥·u + x∥∧u. (The 2nd summand is NULL, since the WEDGE of COLLINEAR vectors vanishes.) Hence, we have x∥⊗u = x∥·u; and since x⊥·u = 0, x∥·u = x·u, so x∥⊗u = x·u. [12a]

And we've noted, elsewhere, that EVERY NONNULL VECTOR (in MULTIVECTOR THEORY) HAS AN INVERSE. Here, the unit vector u has an INVERSE, u⁻¹, such that u⊗u⁻¹ = 1. Then, from [12a], we have x∥⊗u⊗u⁻¹ = (x·u)⊗u⁻¹, so x∥ = (x·u)⊗u⁻¹, [13] since the multiplier of x∥ on the left is UNITY.

ASSIGNMENT: Show that similar steps, when applied to the ORTHOGONAL COMPONENT, lead to x⊥⊗u = x∧u [12b], and to x⊥ = (x∧u)⊗u⁻¹. [14]
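
A NUMERIC CHECK of the decomposition (a sketch; note that for a unit vector u⁻¹ = u, so [13] reduces to x∥ = (x·u)u):

    import numpy as np

    def wedge2(a, b):
        # coefficient of the unit bivector in a∧b, for 2-D vectors
        return a[0] * b[1] - a[1] * b[0]

    u = np.array([np.cos(0.7), np.sin(0.7)])    # a unit vector
    x = np.array([2.0, -1.0])                   # a general vector

    x_par  = np.dot(x, u) * u                   # x∥ = (x·u)⊗u⁻¹, with u⁻¹ = u
    x_perp = x - x_par                          # x⊥ = x − x∥, equal to (x∧u)⊗u⁻¹

    print(np.allclose(x_par + x_perp, x))       # decomposition [12]
    print(np.isclose(wedge2(x_par, u), 0.0))    # x∥∧u = 0 (collinear)
    print(np.isclose(np.dot(x_perp, u), 0.0))   # x⊥·u = 0 (orthogonal)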

Now, you can find that u commutes with x∥ and anticommutes with x⊥. Hence, u⊗x⊗u = u⊗(x∥ + x⊥)⊗u = x∥ − x⊥ (the subtraction showing the ANTICOMMUTATIVITY). [15]

Then, NEGATING [15], we have a "reflector", R1, such that: R1(x) = −(u⊗x⊗u) = x⊥ − x∥. [16]

Thus, R1 reverses the sign of the component of any vector x along the unit vector u, sending x into its "mirror-image". That is, R1 is a reflection operator, outputting an improper transformation -- compared to a rotation, which is a proper transformation.
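
A NUMERIC SKETCH of the reflector (my own construction, in the plane): R1(x) = x − 2(x·u)u, which reproduces x⊥ − x∥ and has determinant −1:

    import numpy as np

    u = np.array([np.cos(0.3), np.sin(0.3)])    # a unit vector
    x = np.array([1.0, 2.0])                    # a general vector

    x_par  = np.dot(x, u) * u
    x_perp = x - x_par
    r1_x   = x - 2.0 * np.dot(x, u) * u         # R1(x) = −(u⊗x⊗u)

    print(np.allclose(r1_x, x_perp - x_par))    # matches [16]

    # As a matrix, R1 has determinant −1: an improper transformation.
    M = np.column_stack([e - 2.0 * np.dot(e, u) * u for e in np.eye(2)])
    print(np.isclose(np.linalg.det(M), -1.0))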

ASSIGNMENT: Show that applying −(u⊗x⊗u) twice RESTORES the vector x, hence that the reflector acts like "a binary switch". (Math Students will recognize that this realizes S2, the symmetric group on 2 elements.)

Consider now another "reflector", R2 = ¯(vÄxÄv), for unit vector, v. And we form their PRODUCT, as a LINEAR TRANSFORMATION: L = R2ÄR1 = L ÄxÄL , where L = uÄ v = u·v + uÙv = eA and L = uÄv = u·vuÙv = e¯A.

Using [12], with the decomposition now taken with respect to the plane of A: L⁻¹⊗x⊗L = L⁻¹⊗(x∥ + x⊥)⊗L = x⊥ + x∥⊗L² = x⊥ + x∥⊗e^(2A). [17]

Comparing [17] with the rotation obtained above from the conjugation v⊗u x u⊗v, we see that [17] is a proper transformation: a rotation through ∠2A. But L effects only HALF of that, a ROTATION through ∠A, as also does its CONJUGATE, L⁻¹. This means that -- as a vector rotates through 360° or 2π TO RETURN TO ITS ORIGINAL POSITION -- a STRUCTURE such as L must ROTATE through 720° or 4π to RETURN TO ITS FORMER POSITION. Such a STRUCTURE is a SPINOR.
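
A NUMERIC CHECK that two reflections compose into a rotation through 2θ (a sketch; u, v, x, and θ are arbitrary choices of mine):

    import numpy as np

    def reflect(x, u):
        # R(x) = −(u⊗x⊗u): reflects the component of x along the unit vector u
        return x - 2.0 * np.dot(x, u) * u

    theta = 0.4                                   # angle from u to v
    u = np.array([1.0, 0.0])
    v = np.array([np.cos(theta), np.sin(theta)])
    x = np.array([0.3, -1.7])                     # a general vector in the plane

    y = reflect(reflect(x, u), v)                 # R2 applied after R1

    # The composite is a rotation through 2θ.
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    rot = np.array([[c, -s], [s, c]])
    print(np.allclose(y, rot @ x))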

Above, we noted that REFLECTIONS COMBINE INTO A ROTATION. This is the content of
HAMILTON'S THEOREM: A ROTATION IS THE PRODUCT OF TWO REFLECTIONS.

This leads to the finding of E. Cartan (1869-1951): CARTAN'S THEOREM: EVERY ISOMETRY CAN BE CONSTRUCTED FROM REFLECTIONS.

Thus, THE ARITHMETIC OF MULTIVECTORS IS AN ARITHMETIC OF REFLECTIONS AND ROTATIONS.


Every rotation operator, R, can be factored into the above multiproduct of spinor-vector-spinor. Hestenes revived 19th-century terminology to describe this. A term factorable into a multiproduct of 1-vectors is a versor. A scalar is a 0-versor. This resembles factoring integers into products of primes, except that prime factorization is UNIQUE, whereas versor-factoring is not. A special case of a versor is a rotor: a versor R that relates to its reverse R̃ UNITARILY: R⊗R̃ = 1. A versor is invertible if, and only if, it is invariant under inversions. The multiplicative group of invertible versors is "The Clifford Group". The multiplicative group of even versors is the spin group.

In New Foundations of Classical Mechanics, p. 283, Hestenes writes of the rotation operator, R: "Thus R and −R are equivalent rotations with opposite senses [his italics] ... The representation of a rotation as a linear transformation in the form RxR̃ or as an orthogonal matrix does not distinguish between these two possibilities. Therefore, spinors provide a more general representation of rotations than orthogonal matrices. Specifically, each unimodular spinor represents a unique orientation [his italics], whereas each orthogonal matrix represents an unoriented rotation [his italics]."


Any matrix of scalars, [sij], can be written as a matrix of inner products of vectors: sij = ai·bj. And det[sij] = det[ai·bj] = (an∧...∧a1)·(b1∧...∧bn).

Thus, all properties of determinants derive from the inner and outer products.
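
A NUMERIC ILLUSTRATION (a sketch; the vectors are random rows of matrices A and B of my own choosing): with sij = ai·bj the matrix is ABᵀ, and the blade product above evaluates to det(A)·det(B):

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.normal(size=(4, 4))     # rows are the vectors a_i
    B = rng.normal(size=(4, 4))     # rows are the vectors b_j

    S = A @ B.T                     # S[i, j] = a_i · b_j
    # (an∧...∧a1)·(b1∧...∧bn) evaluates to det(A)·det(B), up to the ordering convention.
    print(np.isclose(np.linalg.det(S), np.linalg.det(A) * np.linalg.det(B)))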


In the "multiproduct" file, we found the VECTOR FORM of Euler's equation, [8]: aÄb = a · b + a Ù b = cos q + i sin q = eiq. Then, bÄa = b · a + b Ù a = cos q - i sin q = iq. [8a]

Subtracting [8a] from [8] (so that their equal inner products cancel out), we have: a⊗b − b⊗a = a∧b − b∧a = 2a∧b = e^(iθ) − e^(−iθ), so that a∧b = (e^(iθ) − e^(−iθ))/2 = sinh(iθ). [18]

On the other hand, adding [8] and [8a] (so that the opposite outer products cancel out), we have: a⊗b + b⊗a = 2a·b = e^(iθ) + e^(−iθ), so that a·b = (e^(iθ) + e^(−iθ))/2 = cosh(iθ). [19]

And the rest of HYPERBOLIC FUNCTION THEORY follows.
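
A two-line NUMERIC CHECK (a sketch using Python's cmath; the angle 0.6 is arbitrary) that the hyperbolic readings of [18] and [19] agree with the circular functions:

    import cmath, math

    theta = 0.6
    print(cmath.isclose(cmath.sinh(1j * theta), 1j * math.sin(theta)))   # sinh(iθ) = i·sin θ = a∧b
    print(cmath.isclose(cmath.cosh(1j * theta), math.cos(theta)))        # cosh(iθ) = cos θ = a·b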

(In the rotation operator, R, REPLACING COSINE AND SINE, respectively, by COSH AND SINH RESULTS IN ANOTHER ROTATION OPERATOR, from which THE BASIC EQUATIONS OF SPECIAL RELATIVITY MAY BE DERIVED. This is in "The Pauli Algebra", equivalent to QUATERNIONS.)

Other FUNCTIONS involving "complex exponentials", such as BESSEL FUNCTIONS can also be derived.


The concept of a "universal Multivector Theory" can be set forth with BASIS UNITS, di, i = 1,2,...,n, where the SQUARE OF BASIS UNITS, i = 0,1,...,s is +1, but ¯1 for i = s+1,...,n. Whereas standard complex analysis is limited to the PLANE, Brackx and his colleagues have set s = 1 and generalized complex analysis from the plane to (n+1)-dimensions. In the process, they have generalized Cauchy's Theorem, Cauchy's Integral Formula, The Mean Value Theorem, Taylor and Laurent and Fourier Series, etc. There exist n possible complex numbers of the form, z = x0 + ydi, i = 1,2,...,n. Also, monogenic functions (generalizations of analytic functions) are constructed as linear combinations of symmetrized products of those z-functions which satisfy Cauchy-Riemann equations. The generalized concept of residue leads to Green's functions and analytic extension from Rn to Rn + 1, and to generalizations of distributions and Fourier and Laplace Transforms.

This concludes a brief look at THE ARITHMETIC OF CLIFFORD NUMBERS (a.k.a. CLIFFORD ALGEBRA, a.k.a. MULTIVECTOR THEORY, a.k.a. GEOMETRIC ALGEBRA).