Time is homogeneous sequentially-composable determination

Time is the character of courses of events in which determinations——the ways that one event determines the next——are uniform across events and across composing determinations.

Thanks to Sam Eisenstat, Scott Garrabrant, Brady Pelkey, and Kyle Scott for related conversations.

1. Reasons to talk about time

Why talk about time? Here are some key concepts that are bound up with time, and so make demands on the understanding of time:

  • Stability. Suppose a human understands another mind well enough to know that the other mind will have effects the human desires. That understanding (arguably) has to describe something in some sense fixed about the other mind. What the understanding describes about the mind is, in other words, stable in the mind. Stability is, in some sense, a dynamical concept, since it says: "This could change over time, but it doesn't."
  • Counterfactuals, agency. The notion of affecting something is, pretheoretically, temporal. "If I do this, then what will happen afterward as a consequence of that decision?"
  • Values coming to be. The fundamental question "What determines a mind's effects?" asks about the mind's telotect——what brings about that the mind pushes the world in some direction, and what selects the telopheme, which says what direction the mind pushes the world. Across ontological crises, and across the process of becoming more coherent starting from nothing, and across the process of unfolding reference through novelty, a mind has to somehow arrive at how it will affect [the world as understood by the new understanding] (for example, by FIAT).
  • Change, tendency. In general, there may be mental clines——elements that by their nature tend to change in some direction. An element is provisional when it hasn't yet followed all its clines. Clines follow their tendencies over time.
  • Dynamics. In general, dynamical concepts depend on time. E.g.: attraction basins, in/stability, dis/equilibrium, identity, forces, path dependence. These concepts raise questions about the temporal context, like: {stability, etc.} of what? Moving through what space, crossing what intermediate points? Over the course of what events?

See also "Saving Time" and "Finite Factored Sets".

2. Bearers of time

Here are some overlapping aspects of the cosmos that have some timelikeness:

  • Ordinary processes. Rain falling, wheels rolling, iron rusting, plants growing, teeth chewing. Ordinary things going on through sequences of familiar events via regular processes driven by ordinary forces.  
  • Synchronous, comparable processes. It turns out that clocks in different places and times (very nearly, aside from relativity) agree with each other. And, clocks continue to correspond to other processes in the same way as they would correspond in any other place and time. For example, an apple will ripen after the same number of clock ticks (in distribution), no matter if the apple and the clock are placed on the floor in this or that corner of an empty, sealed, symmetric room. Two clocks sent to two different cities will stay synced for a long time, so that if they're reunited they'll mark nearby minutes. A train, shuttling regularly between the two cities (with no rest stops), will mark off [a number of clock ticks of the first city's clock between departing and returning at that city] that is the same (on average) as [the number of clock ticks of the second city's clock between departing and returning at the second city].
  • Phenomenal, subject-internal processes. Regardless of an external world, a mind could have an internal sense of time. The mind can notice sensory irritations changing, or can count purely abstractly. In either case, the "same" mental element has two different states, and so it marks off two moments. (The "same" sensory element lights up in different ways, or the "same" element "appears in two places at once" and so is recognized as multiple-of-the-same where previously there was only one-of-the-same.) Compare Brouwer's "two-oneness".
  • Self-relating, entity-internal processes. Things have their own internal elements and relations, which unfold under their own pressures.
    • Most simply, a machine's operation might mostly depend on internal relations of its components, like gears meshing and wires connecting, and internal events like switches being flipped. The electric kettle shuts itself off, all on its own, in its own time. What changes about a component is salient insofar as it bears on how the component relates to other components, e.g. by being in contact with another component.
    • The internal unfolding marks off the entity's own time. My phone might take 2 seconds or 20 seconds to open a browser app, and then when it's done it goes on to reload the webpage——the events proceed in their own fixed order, but not on a fixed sidereal timetable. The crystal grows, not by the clock, but whensoever another particle of the right kind comes to one of the right places.
    • As a special case of unfolding, a programmed sequence of changes can unfold according to a set schedule proceeding by cumulative stages, e.g. in the growth of living things: the seed, the sprout, the leaves, the bud, the bloom.
    • Growing minds have a progressive structure that comes from what mental elements unlock the creation of what other mental elements. Coming to know things has a partial ordering.
    • Tendencies internal to a system drive the system's changes, giving the system a self-driven character. Art and math progress to deeper abstraction under the force of reflection; tectonic plates drift and grind; orbiting spheroids dance with all of each other at once; languages cycle between isolating and agglutinative tendencies, under competing forces of expressivity and conserving effort (van Gelderen); species fall down fitness cliffs of unlocked niches or phenotypic characters, escalate intraspecific competition, and get sucked into runaway perceptual selection.
    • See Herb Simon on "nearly decomposable systems".
    • Most systems are not totally separated from the rest of the cosmos, and many systems are very interwoven with other systems. For example, human-historical events like wars, treaties, states, governments, and economic production, are heavily related to things that aren't very "historical", like disease, culture, the weather, and math. There's some characteristic internal "historical forces" and "historical events", but not cleanly separated from other events and progressions.
  • Dynamical systems. E.g. Hamiltonian mechanics.
  • Causal systems. See Pearl's theory. If $A$ causes $B$, then $A$ comes before $B$.
  • Information. If a variable $X$ depends on $Y$, so that you have to know $Y$ in order to know $X$, then $X$ comes after (or at least, not before) $Y$ in some sense. See "Finite Factored Sets".
  • Computation. The internal events of a computation are ordered by one event's need to use the outcome of another event: if you're running mergesort you have to first compute the sorted version of the first half of the list, and then use that to compute the sorted version of the whole list, not vice versa. Also, computational complexity induces a time order on tasks: which can be finished first?
  • Logical implication. If $A$ logically implies $B$, and not vice versa, then $B$ comes after $A$. First you know $A$, and then as a consequence you come to know $B$.
  • Probabilistic updating. If hypothesis $H$ predicts $E$ more strongly than the average of how strongly hypotheses predict $E$, then observing $E$ increases the probability of $H$. Probabilistic updating reverses the direction of time compared to logical implication.
  • Definition, founding. A concept that is constituted by another concept is founded on that other concept.
    • For example, the idea of a group is founded on the idea of a set and the idea of a function.
    • An idea $I$ founded on an other idea $I'$ comes "after" the other idea $I'$, in the sense that——at least in many contexts——it makes sense to explain $I$ after explaining $I'$.
    • Note that there's no single ordering of conceptual founding. For example, the idea of a group could also be founded on a pretheoretical familiarity with systems of isometric transformations of geometric objects, without using a clear, explicit notion of set or function.
    • A computer, or the hypothetical "reader with no prerequisite experience assumed", is more of a blank slate than a natural person. There are some orderings of founding that make sense for a blank slate. A formal proof system, whether it uses first-order logic or type theory, whether it uses set theory or category theory, won't define groups by appealing to a pretheoretic familiarity with rigid motions of geometric objects.
  • Discovery, invention. In some cases, $A$ strongly tends (across possible minds) to be discovered or invented before $B$.
    • For example, you don't discover the classification of simple Lie groups before discovering the idea of Lie groups.
    • Discovery time often reverses logical implication, e.g. many pairs of large cardinal hypotheses were discovered out of order of logical strength. For example, Mahlo cardinals are naturally an idea following after inaccessible cardinals, and Mahlo (1911) cites Hausdorff (1908) where inaccessibles are introduced, but the existence of Mahlo cardinals implies the existence of inaccessibles.
    • Discovery time also reverses or cross-cuts definition-time.
  • Justification, purpose. If the goal $G_1$ asks that you pursue the goal $G_2$, then $G_2$ is after $G_1$. First you want $G_1$, and then as a consequence you come to want $G_2$. The right answer to some form of the question "What is the cause of the male deer having antlers?" is "Because it's better for a male deer if he wins the head-bashing fight with other male deer."
  • Design, orchestration, organization.
    • When designing something, high-level considerations are worked out, and then are unfolded into a more detailed design that specifies narrower, more concrete features.
    • Complex mental objects such as plans and theories are also designed.
    • Quoting from "A Pattern Language" by Christopher Alexander, Murray Silverstein, and Sara Ishikawa (IPFS): "The patterns are ordered, beginning with the very largest, for regions and towns, then working down through neighborhoods, clusters of buildings, buildings, rooms and alcoves, ending finally with details of construction. This order, which is presented as a straight linear sequence, is essential to the way the language works. [...] What is most important about this sequence, is that it is based on the connections between the patterns. Each pattern is connected to certain "larger" patterns which come above it in the language; and to certain "smaller" patterns which come below it in the language. The pattern helps to complete those larger patterns which are "above" it, and is itself completed by those smaller patterns which are "below" it."
    • Quoting from "The nature of technology" by W. Brian Arthur (IPFS): "[A] technology is always organized around a central concept or principle: "the method of the thing," or essential idea that allows it to work. The principle of a clock is to count the beats of some stable frequency. The principle of radar——its essential idea——is to send out high-frequency radio waves and detect distant objects by analyzing the reflections of these signals from the objects' surfaces. The principle of a laser printer is to use a computer-controlled laser to "write" images onto a copier drum. As with basic xerography, toner particles then stick to the parts of the drum that have been electrostatically written on, and are fused from there onto paper. To be brought into physical reality a principle needs to be expressed in the form of physical components. In practice this means that a technology consists of a main assembly: an overall backbone of the device or method that executes its base principle. This backbone is supported by other assemblies to take care of its working, regulate its function, feed it with energy, and perform other subsidiary tasks. So the primary structure of a technology consists of a main assembly that carries out its base function plus a set of subassemblies that support this."
  • Emergence. Atop force fields, there emerge particles; atop particles, there emerge molecules and chemical reactions; atop chemical reactions, there emerges life; atop life there emerges mind. Atop water and wind there emerge waves. Atop gas there emerge stars.
  • Construction. Where exactly the foundations were laid constrains where the posts can be erected, which constrains where exactly the drywall can be hung.
  • Self-transformation.
    • Nomic is a game where players can change the rules of the game, including rules about how rules are followed and changed.
    • History is like a big game of Nomic.
    • As a mind grows, the way in which it grows and coherentifies is shaped by something. But by what, and how? The shaping can seemingly come "from any direction" in the space of all structure. See "Values coming to be" above.
    • As a mind grows, it can shape itself for its own purposes. The mind's purposes are prior to themselves??
  • Decision.
    • An agent makes a decision. This puts the agent in a new situation, which presents more decisions for the agent to make.
    • An agent makes a decision. In response, another agent makes a decision.
    • An agent makes a decision. Consequences follow from that decision.
  • Reasoning in general. Thought $B$ naturally follows thought $A$.
  • Growing. An idea can start as an abstract, ungrounded, purposeless, vague, protean, seedlike, inexplicit, provisional, or inconsistent structure, and then grow into a fleshed out, meaningful, useful, concrete, precise, explicit, final, and consistent structure. Whereas design is a process that starts with a specific outcome, a mind doesn't know beforehand what will be the significance of a growing idea.
  • Ontological shifts. As a mind grows it gains new structure and becomes more coherent. The new structure is alien to the mind as the mind was previously, before gaining the new structure. This shift marks off moments in time.
    • A "gap" is opened between the mind as it was previously and as it is now. The mind climbs up a ladder, and then kicks the ladder out from under itself. For example, when an agent invents a new decision theory and then "adopts" it, it's as though a new agent has come into being. The new agent can more clearly coordinate with its current and future self than with its past self.
    • The mind has to interpret, translate, or create its values in the new language it speaks, e.g. by FIAT.
    • The mind is tempted to interpret the previous mind in current terms. E.g. it's tempting and kind of true to say that the Greeks were on their way to Cartesian algebraic geometry, but this is a description that would be sort of alien to the Greeks; they couldn't recognize it as a good description without going through major insights such as treating area and length as combinable in calculation.
  • Supervenience. A coarse or abstract structure can supervene, describe, or "organize" a refined or concrete structure.
    • Attractors in dynamical systems explain, or are "prior to", the endpoints of many particular trajectories of the dynamical system.
    • The mathematical operation of addition supervenes on the electrons moving through wires in a calculator, so that the result of the mathematical operation is "prior to" the lights that flash on the calculator's screen.
    • The basketball's trajectory is an "organizing core" of the massive ensemble of trajectories of all the atoms in the basketball.
    • Some mythological cosmogonies give sequential organizing cores for the cosmos.
  • Statistical mechanics. Say that there's a distribution over a system's current state. Suppose that the distribution is somewhat special——it concentrates mass (uniformly) in some special kind of state that's very rare. (For example, having all the gas particles in a box being in the left half of the box, or having a hot poker next to an ice cube.) Now evolve the system according to its rules of evolution. Assume also that the rules of evolution don't in any way "somehow privilege" the special kind of state that the system is likely to be in (for example, the evolution doesn't always send every special state to another special state). Under these conditions, the evolution takes those special states all over the place randomly. Since the special states are very rare, most states aren't of that special kind. So most special states end up in non-special states. This gives an asymmetry: most non-special states also end up in non-special states, so the non-special kind of state is the unique kind where most states (of either kind) end up after some evolution.

  • Images of time.
    • A cross-section of a tree shows its growth over time projected into space.
    • Digging in the ground where people lived long ago brings up remnants of people in the reverse order of when they lived there.
    • Ontogeny recapitulates phylogeny, though not in a simple way. The sequence of bundles of evolutionary changes leading to a species is (partially and in a complicated way) reflected in the sequence of bundles of morphological changes that an embryo of that species passes through during development. The embryo is a sort of image of evolution, where layers of design-work stacked one on top of the other over evolutionary time are projected (messily) onto steps of construction-work executed one after the other.
    • I suspect there are many rhymes between the sequence of ideas and attitudes that humanity as a whole goes through, and the sequence of ideas that a person goes through. But the situation is very messy.
    • When a person starts to self-reflect much more intensely than before, they can go through a self-archaeology, digging further back in time, seeing layers of mental elements that had been built up. If society is set up to progressively "socialize" people in layered aspects, there will be rhymes between self-archaeology and the socialization layers, and hence between the self-archaeology and historical development of society (in reverse).
    • A scientist unlearns or unconditions on preconceptions, and so "returns" to a more eternal point of view. But curiously, this sort of archaeology always has to first happen "in reverse". Abstraction is a gain of structure, not a loss; minds start parochial, and then expand their domain of discourse.
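
The statistical-mechanics argument above can be checked in a toy setting: take a finite state space, a small "special" set of states, and a reversible evolution rule that doesn't privilege the special set (here a random permutation; the sizes and the permutation are invented for illustration). Counting where states land exhibits the asymmetry:

```python
import random

# Toy check of the statistical-mechanics argument: a reversible evolution
# rule (here a random permutation of a finite state space) that doesn't
# privilege a rare "special" kind of state sends most special states to
# non-special states, and also sends most non-special states to
# non-special states: the asymmetry described above.

random.seed(0)
n_states = 10_000
special = set(range(100))        # the rare special kind: 1% of statespace

evolution = list(range(n_states))
random.shuffle(evolution)        # evolution[s] = where state s goes

special_to_nonspecial = sum(
    1 for s in special if evolution[s] not in special)
nonspecial_to_nonspecial = sum(
    1 for s in range(n_states)
    if s not in special and evolution[s] not in special)

# Most states of either kind end up non-special:
assert special_to_nonspecial / len(special) > 0.9
assert nonspecial_to_nonspecial / (n_states - len(special)) > 0.9
```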

3. Relationships between time-courses

Canonical global time

In some dynamical systems, there's a unique natural global time.

  • In some dynamical systems, the state of the system is "fully connected" with the future state. Anything about the state at one time can affect anything about the state at any future time. In other words, anything in the so-called "past" is actually in the past, in the sense that it might have affected (any given element of) the present.
  • As a toy example, imagine a system $S$ of finitely many equal-sized billiard balls moving in the plane $\mathbb{R}^2$. There are no forces (such as gravity or friction) other than from perfectly elastic collisions. Now, suppose we give a partial description of the state $S_t$ of $S$ at time $t$, e.g. by saying where some of the billiards are and what their velocities are. Does our description pin down the state $S_{t+\epsilon}$ of the system at time $t+ \epsilon$? If our partial description was a full description, then yes, $S_{t+\epsilon}$ is completely pinned down. If our description was strictly partial, e.g. we didn't say where one of the billiard balls is, then of course we haven't fully pinned down $S_{t+\epsilon}$; the last billiard ball could end up anywhere at that time. Even if our description was strictly partial, have we at least pinned down something about $S_{t+\epsilon}$? Strictly speaking, yes, because we've ensured that the system starting in state $S_{t+\epsilon}$ but with the velocities negated will, after being run forward for time $\epsilon$, end up satisfying our description. But we haven't ensured anything at all about where an individual billiard ball is, or what will be in any given place, at time $t+\epsilon$: the unspecified ball could knock any other ball around arbitrarily. In other words, the contents of any region of the state at time $t$ can strongly affect the contents of any other region after any positive amount of time has elapsed. Note that this example obeys relativity——the physical law is the same in any inertial frame——though it can't support a speed limit.
  • In Conway's Game of Life, states are not fully connected, and specifying a small region at one time does specify some region at some of the subsequent times. There is a unique global time in some sense though. For one thing, although there are lightcones and unaffectable stuff that hasn't happened yet, we can't actually draw any space-like surfaces other than the standard one. For another thing, you can tell if you're in inertial motion, so there's a natural unique frame (the rest frame). And, all observers (at rest or moving), using the naive notion of simultaneity, will agree on simultaneity.
  • However, even if there is a canonical global time in this sense, there can be a more complex structure of time-courses in the system. See below.
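
As a small concrete instance of such a discrete dynamical system, here is a minimal Game of Life step on an unbounded plane, run on a glider (the set-of-live-cells representation is just one convenient choice). The glider's internal unfolding marks off the global time: after four steps it reappears, translated one cell diagonally.

```python
from itertools import product

def step(live):
    """One Game of Life step on an unbounded plane; `live` is a set of (row, col)."""
    counts = {}
    for (r, c) in live:
        for dr, dc in product((-1, 0, 1), repeat=2):
            if (dr, dc) != (0, 0):
                cell = (r + dr, c + dc)
                counts[cell] = counts.get(cell, 0) + 1
    # a cell is live next step with exactly 3 neighbors, or 2 if already live
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

glider = {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)

# After four steps the glider reappears, translated one cell down-right:
assert state == {(r + 1, c + 1) for (r, c) in glider}
```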

Canonical local time

  • In a relativistic world, there's no way to distinguish between inertial frames of reference (by postulate). This, along with the invariance of the speed of light, requires that if there's a natural notion of simultaneity for an observer, that notion of simultaneity as applied by another observer will make different judgements of simultaneity. In other words, there's no privileged global counting of time. (If the notion of simultaneity isn't required to be a natural function of the observer's observations, you can just say "measure simultaneity according to International Atomic Time, Earth, Sol, Milky Way".)
  • However, there is an absolute notion of "internal" or "local" time, called proper time. Everyone agrees on how many times a particular clock has ticked between the moment when it left one station aboard a train and the moment when it arrived at another station. This preserves some core comparability between processes over time. Apples ripen at the same rate, compared to a nearby clock, whether on Earth or on a near-lightspeed ship.
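
A minimal computation of proper time (the numbers are illustrative): a clock carried at constant speed $v$ through coordinate time $t$ ticks off $t\sqrt{1 - v^2/c^2}$ of its own time, and every observer agrees on that count.

```python
import math

def proper_time(coordinate_time, v, c=1.0):
    """Proper time ticked off by a clock moving at constant speed v,
    over `coordinate_time` elapsed in the rest frame."""
    return coordinate_time * math.sqrt(1.0 - (v / c) ** 2)

# A clock carried at 0.8c through 10 years of Earth-frame time ticks off
# about 6 years of its own (proper) time; every observer agrees on this.
assert abs(proper_time(10.0, 0.8) - 6.0) < 1e-9
# A clock at rest just counts coordinate time:
assert proper_time(7.0, 0.0) == 7.0
```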

Forced shared time

Stuff goes on in the world, regardless of what you think, on its own time, and then that stuff can impinge on you. Time courses forcibly (or "objectively") interact with each other.

  • The physical universe gives one example of objective time.
  • Another example is acausal trade time. To make acausal trades, you might have to make decisions legibly, which means making some policy-level decision earlier in the time-order of decisions (e.g. "If you cooperate, then I'll cooperate.", or more complicated decisions like that). What was supposed to be timeless might just have a different time-course, and that time-course may be just as forcing as physical time.
  • A non-example would be separate dynamical systems. Their dynamical-system-timecourses run in parallel and don't interact.
  • A class of intermediate examples is the situation of many parallel timecourses. One example is some models of two-dimensional time, where a dynamical system allows time travel by finding a fixed point of the operation of allowing one run of the system to send emissaries to appear in the next run of the system. Another example is children, whose trajectories share many similar progressions. Children's trajectories are largely individual but with substantial interactions, e.g. a child can "send messages back in time" by communicating with younger children. (Adults too, though many adults are stuck in time.)
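
The two-dimensional-time example above can be caricatured as a fixed-point computation. Here the "world", the message space, and the dynamics are all invented for illustration; the point is only the shape of the construction: a consistent time loop is a run whose outgoing emissary equals its incoming one.

```python
def run_world(incoming):
    """One run of a toy world: the incoming message from 'the future'
    influences the run, which emits a message back. (Dynamics invented
    for illustration, chosen to be a contraction so iteration converges.)"""
    return 0.5 * incoming + 3.0

# A consistent time loop is a fixed point: the emissary sent back equals
# the emissary received. Iterating run-after-run finds it.
m = 0.0
for _ in range(60):
    m = run_world(m)

# The unique fixed point is 6.0, since 0.5 * 6.0 + 3.0 == 6.0.
assert abs(m - 6.0) < 1e-9
```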

Criss-crossing timecourses

Timecourses——channels in which timelike sequences of events go on, connected in a timelike way——can cross each other.

§ Multiple tributaries

Timecourses can intersect in an event, coming from different directions.

  • Two things can contribute to the same event, so that their timecourses pass through the same event.
    • E.g. two rocks thrown at the same time at the same window ensure that the window breaks.
    • Two things' internal times can collide. E.g. if two people are playing capture the flag and both want to tag an enemy player who just picked up the flag, so they both sprint towards the enemy from different directions without noticing each other, their two internal courses of experience will then acutely affect each other by leading to a collision.
  • Coarse dynamics lead to coarse facts about an event in a dynamical system, and fine dynamics lead to fine facts. There's some "leakage" between the two timecourses, so they "intersect" (at an acute angle, going almost parallel) in that they both lead to the one event. E.g. when and where a thrown basketball hits the ground can be mostly understood as a result of the basketball's weight, gross volume, aggregate velocity, and aggregate experienced air friction and gravity. More precisely, it matters which exact air molecules hit the ball where and when, where are the bumps on the basketball and the concrete, how defects in the basketball's rubber affected how it deformed due to air friction, and so on. (Of course the finer dynamics, if understood completely, could stand on their own.)
  • If $5$, $12$, and $\times$ are entered into a calculator, it will show $60$. The event of the calculator showing $60$ follows from the physical dynamics of the calculator——the sequence of states of electrons moving through the wires. The event also follows from the logical structure of arithmetic: since the calculator implements integer multiplication when given $\times$ and since $5\times 12= 60$, the calculator shows $60$. The timecourse of logical implication or abstract computation intersects the timecourse of physical events and physical law, at less acute, maybe almost orthogonal angles.
  • The proto-vertebrates fell down a steep fitness gradient and gave rise to the vertebrates. This followed from the state of the genome pool and environment of the proto-vertebrates, through physical time and evolutionary time. From another perspective, it followed from the ("prior") existence of an attractor basin in designspace, "already there" for any species to stumble into, which "pulled in" the proto-vertebrates.
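
The calculator example can be made concrete: the same event (showing $60$) is forced both by a mechanism-level unfolding (sketched here as a shift-and-add procedure, a simplified stand-in for the hardware) and by the timeless arithmetic fact.

```python
def shift_and_add(a, b):
    """Multiply nonnegative integers the way simple hardware does:
    walk b's bits, accumulating shifted copies of a."""
    acc = 0
    while b:
        if b & 1:       # low bit of b is set: add the current copy of a
            acc += a
        a <<= 1         # shift a left (double it)
        b >>= 1         # move to b's next bit
    return acc

# Two timecourses intersect in one event: the step-by-step mechanical
# unfolding, and the timeless arithmetic fact that 5 * 12 = 60.
assert shift_and_add(5, 12) == 5 * 12 == 60
```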

§ Loopy time

A timecourse can seem to go in a loop, so that something is in its own past.

[Image: a Turing pattern. Modified from https://pmontalb.github.io/TuringPatterns/.]

  • Something can seem to cause its own existence. For example, why is there a red patch in that particular spot in the Turing pattern shown above? Answer: because there's a red patch there, so red is created more densely there. Going backwards in time from the red patch, we find the same red patch again.
  • Things can seem to bring each other about, in a cycle.
    • The wind pushing on the raised part of the wave causes the wave to become more raised, which causes the wind to catch on the wave more forcefully. The wind being caught comes after the wave being raised, and the wave being raised comes after the wind being caught.
    • A living thing is maintained in its existence by each part contributing to overall activity that leads to each part being maintained.
    • A person's specialization might cause itself. Since they are a bit better than normal at repairing shoes, they are asked a bit more frequently to repair shoes. Since they repair shoes more frequently than normal, they become better than normal at repairing shoes. The same might happen to a mental element playing its role in the mind——so that a natural idea comes in part from what a similar idea would tend to be called upon to be.
  • Since agency is time travel, agents create loops in time. E.g. self-fulfilling prophecies, basilisks, game theory outcomes that depend on common knowledge.
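
The self-causing specialization can be put in a toy model (all parameters invented for illustration): requests flow toward whoever is more skilled, and skill grows with requests handled, so a tiny initial edge amplifies itself.

```python
# Toy model of self-causing specialization (all parameters invented):
# requests arrive in proportion to relative skill, and skill grows in
# proportion to requests handled, so a tiny initial edge amplifies.

skill = [1.00, 1.01]                 # person 1 starts with a 1% edge
initial_ratio = skill[1] / skill[0]

for _ in range(50):
    total = skill[0] + skill[1]
    skill = [s * (1 + 0.2 * s / total) for s in skill]

ratio = skill[1] / skill[0]
# The gap feeds itself: the relative edge has grown.
assert ratio > initial_ratio
```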

§ Directly opposed time

Timecourses can be directly opposed, flowing in opposite directions and colliding with each other.

  • 2-sided thermodynamic arrow of time.
    • Let $S_t$ be the state of the universe at time $t$. Suppose the universe is governed by some dynamical laws so that the universe obeys the second law of thermodynamics (I'm not sure exactly what to assume here). Suppose that the dynamical laws are reversible, so that the evolution of the universe will also satisfy the dynamical laws with time reversed.
    • If we have the maximum entropy prior over the state $S_0$, then we also have the maximum entropy prior over $S_t$ for any $t$.
    • If we then probabilistically condition on $S_t \in s$ where $s$ is some tiny, "natural" subset of statespace ("macrostate"), then we'll come to believe in an arrow of time. Our probability distribution over $S_{t'}$ for $t' > t$ will, as $t'$ increases, trend toward the maximum entropy distribution over "natural" subsets of statespace. If we see two macrostates $s_{t'}$ and $s_{t''}$ with $t'>t$ and $t''>t$, where $s_{t'}$ is lower entropy than $s_{t''}$, then we'll guess that $t'' > t'$.
    • Note that this also works backwards from time $t$. Since the dynamical laws are time-reversible, the situation, even after conditioning on $S_t \in s$, is symmetric under the transformation $t + \delta \leftrightarrow t - \delta$. So looking backwards from $S_t$, there's a backwards arrow of time. One imagines that before the big bang, there was a big weird crunch, where eggs unsmash and unlay and so on. Or more sensibly, that the big bang is like the summit of a mountain, with physics running forward down either side.
    • Suppose that we condition our maximum entropy prior on the conjunction of two properties: $S_t \in s$ and $S_{-t} \in s$. What does the resulting conditional distribution look like? I don't know. But I think it looks like a mostly-normal universe starting at time $-t$ and running forward in time, a mostly-normal universe starting at time $t$ and running backwards in time, and a weird zone in the middle around time $0$ where... I don't know what. But it seems like an instance of timecourses colliding in a directly opposed way——though maybe it would be righter to say that they synergistically push down the entropy in the middle zone?
  • Contradiction.
    • As logical time goes on, propositions grow an aura of other propositions that they imply. The auras of two propositions $P_0$ and $P_1$ might contradict each other, one saying $Q$ and the other saying $\neg Q$. In a context where both propositions to some extent "ought to" or "were trying to" be true, they then collide on $Q$. One or the other logical timecourse might go on, so that e.g. since $P_1$ leads to $Q$, it then leads to $\neg P_0$ via $Q$.
    • Constraints propagate as logical time goes on. Multiple, incompatible constraints might come to apply to one thing.
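
The two-sided thermodynamic arrow sketched above can be watched in a classic toy system, the Kac ring (the parameters here are arbitrary): the dynamics are exactly reversible, yet starting from a special low-entropy state, the imbalance decays whether you run the ring forward or backward.

```python
import random

# Kac ring: N balls on a circle, each white (0) or black (1). Each step,
# every ball moves one site clockwise, flipping color when it crosses a
# "marked" edge. The rule is exactly reversible (move counterclockwise,
# flipping at the same marks), yet from the special all-white state the
# color imbalance decays whether the ring is run forward or backward.

random.seed(0)
N = 2000
marked = [random.random() < 0.1 for _ in range(N)]   # ~10% of edges marked

def step(colors, direction=1):
    """One step of the ring; direction=-1 exactly undoes direction=+1."""
    if direction == 1:
        # the ball arriving at site i came from site i-1, crossing edge i-1
        return [colors[i - 1] ^ marked[i - 1] for i in range(N)]
    else:
        # inverse: the ball arriving at site i came back from site i+1
        return [colors[(i + 1) % N] ^ marked[i] for i in range(N)]

def imbalance(colors):
    """|#white - #black| / N: 1.0 for all-white, near 0 at 'equilibrium'."""
    whites = colors.count(0)
    return abs(2 * whites - N) / N

start = [0] * N                  # the special, rare kind of state: all white
forward = start
backward = start
for _ in range(50):
    forward = step(forward, +1)
    backward = step(backward, -1)

assert imbalance(start) == 1.0
assert imbalance(forward) < 0.5      # entropy went up, running forward
assert imbalance(backward) < 0.5     # ...and also running backward

# Reversibility: undoing the forward steps recovers the special state exactly.
undone = forward
for _ in range(50):
    undone = step(undone, -1)
assert undone == start
```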

4. The thingness of time

Is time a Thing? Looking through the above list of timecourses, are there central shared features that lead into other relevant shared features?

Past, present, future

The past is something you can learn from, remember from; it's something you could have seen or otherwise sensed. What's past is in a sense locked in, irreversible; it can't be changed that, on Friday June 30 2023 at 10:13, I was typing this sentence. The past can also lock in or irreversibly cause enduring features of the cosmos. E.g. if a hard-drive is dissolved in a vat of acid, the data it uniquely stored is irretrievably lost. In some cases the past is something you could respond to, diagonalize against, erase, contravene, or reverse——you can react to affect the past's effects, though you can't affect the past. The past gave rise to you, made you what you are.

The future is something you can affect. It's something you can't see and can't have seen, except in imagination (or by waiting). It's something that is not locked in, that could be changed. You give rise to the future and to your future self.

The present is what you immediately——without mediation——sense and act upon. It's where plans cash out into actions, where actions are taken; it's the purview and responsibility of the actor. It's the last chance to affect things, where future becomes past.

Determination as the core of time

Time involves succession: an event B comes after another event A, and A comes before B. Why do we say "after", instead of just saying that there are two events, A and B, which are separate, just like two simultaneous events are separate?

One answer is: because we compare A and B against other processes, such as the spinning hands of a clock, and see the ordering of A and B relative to the familiar ordering of the clock-hand positions. This says something about when and how we say "after", but it doesn't say why we say "after".

Another answer: We say that B comes after A when A can affect B. If A determines, affects, or causes B——then A comes before B. If B depends on A, or if B is a consequence or implication of A, or if B is decided, determined, committed to, or locked-in by A——then B comes after A.

"Affect" is not quite the right word. It seems to emphasize physical causality and exclude some timecourses, e.g. logical constraints propagating or discoveries unlocking other discoveries. Instead:

Time is the course of determination.

This opens the black box of time and takes out a slightly different, still mysterious, black box called "determination". The question "What is determination?" or "How should I think about determination?" is almost the same question as "How should I think about counterfactuals?". The emphasis of "counterfactuals" is on the agent: What if I did this or that? What would then come as a consequence of my action? "Determination" emphasizes some partially objective structure——the reality that bites back, the reality that you can't get around just by having something different in your head. I suspect both forces have to be integrated: the root of determination in the actions of an agent, and the sturdy cage/scaffold of non-mind-determined reality.

Quoting the closing paragraphs of "Timeless Decision Theory" by Eliezer Yudkowsky:

I wish to keep the language of causality, including counterfactuals, while proposing that the language of change should be considered harmful. Just as previous statisticians tried to cast out causal language from statistics, I now wish to cast out the language of change from decision theory. I do not object to speaking of an object changing state from one time to another. I wish to cast out the language that speaks of futures, outcomes, or consequences being changed by decision or action.

What should fill the vacuum thus created? I propose that we should speak of determining the outcome. Does this seem like a mere matter of words? Then I propose that our concepts must be altered in such fashion, that we no longer find it counterintuitive to speak of a decision determining an outcome that is "already fixed." Let us take up the abhorred language of higher-order time, and say that the future is already determined. Determined by what? By the agent. The future is already written, and we are ourselves the writers. But, you reply, the agent's decision can change nothing in the grand system, for she herself is deterministic. There is the notion I wish to cast out from decision theory. I delete the harmful word change, and leave only the point that her decision determines the outcome——whether her decision is itself deterministic or not.


If A determines B and B determines C, then A determines C.

This transitivity formula specializes to the various timecourses. If A {logically implies, physically causes, gives rise to by emergence, unlocks the discovery of, structures the design of, sets the preconditions to decide, unfolds into, ...} B and B does the same to C, then A does the same to C.

The transitivity of determination sets up a genuine sequence of events connected by determination. The sequence [A,B,C] isn't just a collection; it has an ordering that consistently relates all the elements of the sequence to each other. What we call time is a time-course, a sequence of composed determinations.
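The point that transitivity turns pairwise links into a genuine ordering can be made concrete with a small sketch. This is illustrative only: "determines" is modeled as a bare relation on event labels, and the function name is hypothetical.

```python
# Sketch: transitivity of determination.  Given direct determinations
# A->B and B->C, the transitive closure also contains A->C, so every
# pair of events in [A, B, C] is related, one way round and not both.

def transitive_closure(pairs):
    """Smallest transitive relation containing the given pairs."""
    closure = set(pairs)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(closure):
            for (c, d) in list(closure):
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
    return closure

direct = {("A", "B"), ("B", "C")}
full = transitive_closure(direct)
print(("A", "C") in full)  # True: A determines C via B

events = ["A", "B", "C"]
totally_ordered = all((x, y) in full or (y, x) in full
                      for i, x in enumerate(events)
                      for y in events[i + 1:])
print(totally_ordered)  # True: the sequence is ordered, not a mere collection
```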


There wouldn't be much power in the idea of time, or determination, if there weren't anything to say about it that applied across many determinations, or many connections between events. Physical law, and generally dynamical laws, are time-invariant. The connection between an event A and a subsequent event B——the way A determines B——greatly overlaps with the connection between B and a subsequent event C. For example, the same logical laws apply when deriving B from A and when deriving C from B.

That's how clocks work: what goes on in the clock, goes on in (roughly) the same way, however much it's already gone on.
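The clock picture of homogeneity can be sketched as composing a fixed update rule. A toy example, with hypothetical names, assuming a 12-hour clock whose law is "advance one hour":

```python
# Sketch: homogeneous, sequentially composable determination.  The same
# step applies however much the clock has already gone on, and composing
# the step with itself yields a map of the same form (advance by a fixed
# increment, mod 12).

def step(hour):
    """One determination: this hour determines the next."""
    return (hour + 1) % 12

def compose(f, g):
    """Sequential composition: first f determines, then g."""
    return lambda x: g(f(x))

two_steps = compose(step, step)
print(two_steps(10))  # 0: advancing twice from 10 o'clock wraps to 0

# The composite is again "advance by a fixed amount": same law, more time.
print(all(two_steps(h) == (h + 2) % 12 for h in range(12)))  # True
```

Here the homogeneity is that `compose(step, step)` is not a new kind of law but the same kind, which is what lets determinations chain into a timecourse.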

There are other invariances, e.g. the invariance of physical law with respect to space or motion, or the invariance of the laws of logical deduction with respect to a difference of axioms. Time forms ordered strands, sequences of connections. Time is the character of homogeneity in determinations that can be composed sequentially, one after the other (so to speak). If $A\to B$ and $B\to C$, then also $A\to C$, and furthermore the way in which $A\to C$ is like the way in which $A\to B$ and $B\to C$. So we have the formula:

Time is homogeneous sequentially-composable determination.

From https://falseknees.com/297.html.