Elements of vector mathematics and piecewise linear analysis
are used to delineate and mathematically formalize each step in the process
by which the TimeWave Zero (TWZ) 384 number *data set* is generated.
This development begins with the King Wen hexagram sequence and proceeds
to the final 384 number data set, using standard mathematical procedures
and operations. The process is clarified and streamlined by the introduction
of vector notation and operations, which also preserve the notion of directed
wave flow described by McKenna.

This 384 number data set serves as the input data file
for the TWZ software, which performs a "fractal" transform on the input
data in order to produce the output *TimeWave* viewed on the computer
screen as an x-y graph of *Novelty.* The basis for this data set is
the first order of difference (FOD) of the King Wen sequence, defined as
the number of lines that change as one moves from hexagram to hexagram,
beginning at hexagram 1 and proceeding to hexagram 64. This FOD number
set and its derivatives are produced by a series of clearly defined
mathematical operations, all of which are described in detail.
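The FOD described above can be sketched directly: it is the Hamming distance between consecutive hexagrams. The following is an illustrative sketch, not the TWZ code itself; the encoding (each hexagram as a 6-tuple, 1 = yang/solid line, 0 = yin/broken line, bottom line first) is an assumption made here for clarity.

```python
def first_order_of_difference(hexagrams):
    """Number of lines that change between each consecutive hexagram pair."""
    return [
        sum(a != b for a, b in zip(h1, h2))
        for h1, h2 in zip(hexagrams, hexagrams[1:])
    ]

# First four King Wen hexagrams, in the encoding assumed above:
king_wen_start = [
    (1, 1, 1, 1, 1, 1),  # 1. Ch'ien / The Creative
    (0, 0, 0, 0, 0, 0),  # 2. K'un / The Receptive
    (1, 0, 0, 0, 1, 0),  # 3. Chun / Difficulty at the Beginning
    (0, 1, 0, 0, 0, 1),  # 4. Meng / Youthful Folly
]

print(first_order_of_difference(king_wen_start))  # -> [6, 2, 4]
```

Applied to the full 64-hexagram sequence, this yields the FOD number set from which the 384 number data set is built.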

Once this *revised* 384 number data set has been
calculated, it is used as input to the TWZ software in order to generate
*revised* *TimeWaves* that may be compared with the original
*standard* *TimeWaves*. Several *random* number sets
are also generated and used similarly to produce *random* *TimeWaves*
for comparison. Fourier transform operations are performed on each of the
384 number data sets, in order to examine wave noise and information content.
Correlation is used to determine the degree of interdependence between
the two *data* sets, and between the data and *random* number sets.
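The two comparison tools named above can be sketched as follows. This is an illustrative sketch only, applied to synthetic stand-in arrays rather than the actual TWZ data sets: a Pearson correlation coefficient between two 384-point sets, and an FFT power spectrum for examining noise and information content.

```python
import numpy as np

rng = np.random.default_rng(0)
standard = rng.random(384)                   # stand-in for the standard set
revised = standard + 0.1 * rng.random(384)   # stand-in for the revised set

# Degree of interdependence between the two data sets:
r = np.corrcoef(standard, revised)[0, 1]
print(f"correlation: {r:.3f}")

# Power spectrum (squared FFT magnitudes) for inspecting noise content:
power = np.abs(np.fft.rfft(revised)) ** 2
print(f"spectral bins: {power.size}")  # 384//2 + 1 = 193
```

A high correlation between two such sets, as with the stand-ins here, indicates that one retains most of the information content of the other.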

The results of the mathematical formalization and subsequent
comparison analysis show that the *revised* data set produces a *TimeWave*
that appears to reflect historical process with greater accuracy than the
*standard TimeWave*. This difference is likely due to the fact that
the *standard* data set produces a distorted *TimeWave*, as the
result of embedded mathematical errors that increase the noise level in
the wave. Comparisons of the *standard* and *revised* *data
sets* and *TimeWaves* show a generally high degree of correlation,
implying that the *standard wave* retains much of the information
content of the revised wave, despite its distortion. This *TimeWave*
information content, or wave signal-to-noise ratio (s/n), is improved
by using the revised data set, which serves to correct the noise distortion
present in the standard wave.

__Introduction__

*TimeWave Zero* (TWZ) **[1]** is a mathematical and graphical expression of the *Novelty
Theory* advanced by Terence McKenna, and implemented by computer software
called *Time Surfer* for Macintosh, and *Time Explorer* for DOS
operating systems. It is based on a specific mathematical relationship
exhibited by the King Wen sequence of the I-Ching - i.e. the number of
lines that change as one moves from one hexagram to the next, beginning
at hexagram 1 and proceeding to hexagram 64. This number set, called the
First Order of Difference (FOD), was first expressed and expanded by McKenna
**[2]** and others into the TimeWave
that is produced by the TWZ software. The philosophical nature and theoretical
basis of the TimeWave have been reported extensively elsewhere and will
not be discussed in detail here. However, the general thrust of *Novelty
Theory* is that information about some fundamental natural process
is encoded in the I-Ching in general, and the FOD number set in particular.
This process is thought to express itself in nature and the cosmos, as
the ongoing creation and conservation of increasingly higher ordered states
of complex form. The *TimeWave* is then viewed as expressing this
process as a kind of fractal map of temporal resonance in nature, or as
an expression of the ebb and flow of an organizing principle called *Novelty*.

The conversion of this FOD number set into the TimeWave
(viewed on the TWZ computer screen as a graph of the Novelty process)
involves performing a series of mathematical procedures and operations
on this number set. The TimeWave is actually produced in two distinct and
mathematically different phases. The first phase includes the creation
of a simple bi-directional wave using the FOD number set. This wave is
then expanded into linear, trigramatic, and hexagramatic bi-directional
waves that are subsequently combined to form the *tri-level complex wave*,
or 384 number data set. The second phase is performed by the TWZ software
itself, which includes an expansion, or "fractal transform" of the 384
number data set (input file to TWZ) to produce the TimeWave viewed on the
computer screen. Phase I uses the mathematics of piecewise linear analysis
to generate the 384 number data set from the FOD number set, whereas Phase
II uses slightly more complex infinite series expansions to
convert the Phase I data set into the final TimeWave. The formalization
and comparison work described in this report is concerned only with the
Phase I mathematics.

Until recently, the details of the genesis and development
of Novelty Theory and the TimeWave, although available to all with the
will and energy to examine them, have remained largely out of sight and
out of mind for most. The primary focus has been on the results of that
development - i.e. the reflective and apparently projective characteristics
of Novelty Theory as expressed by the TimeWave, and graphed by the TWZ
software. That is, until Matthew Watkins, a
British mathematician, proceeded to deconstruct the wave generating process
and examine those details more closely. The results of his investigation
were reported in a paper entitled *Autopsy
for a Mathematical Hallucination* **[3]**,
linked to the McKenna website Hyperborea
as the Watkins Objection.

There were several things that Watkins found objectionable
in his scathing critique of *Novelty Theory* and *TWZ, *but there
was just one significant finding that he substantiated in his report. He
showed that one of the operational steps used in the production of the
384 number data set, the notorious "half twist", was not mathematically
consistent with the standard linear analysis that is implied by the documentation
in *the Invisible Landscape* and the *Time Explorer* software
manual. He pointed out that the two number sets produced by first
the inclusion, then the exclusion, of the half twist would be different
sets resulting in different *TimeWaves*. However, he didn't quantify
this difference in number sets, nor show what the resulting impact on the
final TimeWave would be. He then concluded that without some miraculous
justification for the "half twist", his findings would prove fatal to *TimeWave
Zero* and *Novelty Theory*. This conclusion seemed somewhat speculative
and overstated to me, since he hadn't actually shown what the impact of
his findings on the *TimeWave* itself would be. Nonetheless, it was
an important finding, so I decided to investigate the matter for myself
in order to assess the actual impact on the *TimeWave* and the corresponding
damage to *Novelty Theory.* This meant, of course, that I would have
to immerse myself in the details of the TWZ mathematical development.

Becoming familiar with the details of the mathematical development of TWZ proved to be more of a challenge than expected, partly because the available documentation lacked the descriptive detail necessary to faithfully reconstruct the process of TimeWave generation. Additionally, some of the mathematical operations were described in unconventional language that was somewhat confusing, making it more difficult to understand what was actually being done. So in order to clarify this process of wave generation, I proceeded to delineate and mathematically formalize each of the steps in the process that takes one from the King Wen hexagram sequence to the final 384 number data set - Phase I of the TimeWave generating process. I also felt it important that this formalization be done in a way that could be clearly visualized, in order to give one a mental picture of what might actually be happening as one proceeds through the development process. I felt that it should be more than merely a correct, but arcane, mathematical formulation.

An important feature of the standard development process,
clearly shown in all the *TimeWave Zero *documentation, is that the
process is expressed by piecewise linear mathematics - meaning simply that
the final 384 number data set is the result of the expansion and combination
of straight line segments. These linear segments are bounded by integers
derived from the FOD number set, although the line segments themselves
take on non-integer values between those endpoints. Another
important and well-documented feature of the process is the generation
of the simple bi-directional wave from the FOD number set. This bi-directional
wave consists of a forward and reverse flowing wave pair, and it is the
fundamental waveform or building block of the *TimeWave* generating
process. These two features, a piecewise linear nature and wave directed
flow, clearly lend themselves to expression through the principles of vector
mathematics. Vector notation and operations were consequently chosen as
appropriate tools for this modeling process.
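The bi-directional wave pair described above can be given a minimal sketch. The construction shown here is an assumption made for illustration only - the forward wave taken as the number set itself and the reverse wave as the same set read in the opposite direction; the exact construction used in the TWZ development involves further steps set out in the formal delineation.

```python
def bidirectional_wave(fod):
    """Build a forward/reverse wave pair from an FOD-style number set.

    Illustrative assumption: the reverse wave is simply the forward
    wave traversed in the opposite direction.
    """
    forward = list(fod)
    reverse = list(reversed(fod))
    return forward, reverse

fwd, rev = bidirectional_wave([6, 2, 4, 4])
print(fwd)  # -> [6, 2, 4, 4]
print(rev)  # -> [4, 4, 2, 6]
```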

It should be noted here that there is nothing unique or exceptional about the use of vector mathematics. It is only one of several approaches that could have been used; but it is one that clearly expresses the notion of wave directed flow, and one that also has the capacity to generate straight-line segments. In fact, only a few of the basic features of vectors are used here - vector addition and subtraction, and the vector parametric equation of the straight line. However, generating straight-line segments with vectors converts the discrete function (integer values only) represented by the FOD number set into a continuous function in the domain bounded by the FOD integers. This is important if the wave is to be well defined over the entire range of its expression (i.e. the inclusion of fractional values).
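The discrete-to-continuous conversion just described can be sketched with the one-dimensional case of the vector parametric line, p(t) = a + t(b - a) for 0 ≤ t ≤ 1. This is an illustrative sketch, not the TWZ implementation; the function and sample values are hypothetical.

```python
def piecewise_linear(values, t):
    """Evaluate the piecewise linear function through the integer points
    (0, values[0]), (1, values[1]), ... at a fractional position t."""
    i = min(int(t), len(values) - 2)   # index of the segment containing t
    frac = t - i                       # parameter along that segment
    a, b = values[i], values[i + 1]
    return a + frac * (b - a)          # parametric line, 1-D case

fod = [6, 2, 4, 4]
print(piecewise_linear(fod, 0.5))   # -> 4.0  (halfway between 6 and 2)
print(piecewise_linear(fod, 1.25))  # -> 2.5  (a quarter of the way from 2 to 4)
```

At integer positions the function returns the FOD values themselves; between them it is well defined for any fractional argument, which is exactly the property the wave requires.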

This work is a formalization of the procedures already established by McKenna for the standard wave development, one that removes inconsistencies and makes the process more coherent and intelligible. It does not, in any way, make fundamental changes in the development process, nor does it modify the underlying theory.