The Simulation Argument Revisited in a Functional Universe

What if the most commonly overlooked assumption of the simulation argument is also the most consequential? That assumption concerns time itself.


Irreducible duration of interactions

The Hidden Temporal Assumption

Nick Bostrom’s simulation argument rests on a deceptively simple statistical premise: if sufficiently advanced civilizations can run vast numbers of simulated conscious observers cheaply and at scale, then simulated minds will vastly outnumber biological ones. From this numerical dominance, it follows that we should assign a high probability to being simulated ourselves.

What is often left implicit, however, is a strong assumption about time, namely, that simulated realities can be executed arbitrarily faster than the physical universe in which they are implemented.

The Functional Universe (FU) challenges this assumption at its root.

Time as Transition, Not Parameter

In the Functional Universe, time is not an external parameter that indexes state evolution. It does not “flow” independently of physical processes, nor can it be rescaled at will. Instead, time is defined operationally as the irreducible duration of interaction required for a transition between functional states.

Formally:
  • States $f_n$ are not static objects but stabilized functional interfaces.
  • Change occurs only through transitions: $f_n \xrightarrow{d\tau} f_{n+1}$
  • Each transition requires a finite, non-zero duration $d\tau$, bounded below by physical constraints.
Thus, time is not an independent variable; it is the accumulated cost of causal commitment. To exist in time is to undergo transitions.

What It Means to Simulate a Universe

Bostrom’s argument implicitly treats simulations as compressible replays of reality: the same causal structure, executed more quickly on superior hardware.

The Functional Universe draws a stricter distinction:
  • A simulation that merely computes outcomes without preserving causal structure is not a universe.
  • A faithful reproduction of a universe must be an emulation - a system that instantiates the same class of transitions, subject to the same irreducible interaction constraints.
If a simulated universe preserves:
  • causal locality,
  • finite propagation speeds,
  • non-zero transition durations $d\tau$,
  • compositional state evolution,
then its causal history cannot be arbitrarily accelerated without altering the physics being instantiated. A universe that “runs faster” is not our universe reproduced; it is a different functional system with different transition constraints.

The Impossibility of Arbitrary Speedup

Within the Functional Universe, causally dependent transitions cannot be skipped, parallelized, or compressed beyond their irreducible interaction costs. Let a conscious process correspond to a chain of committed transitions: 

$\gamma = \{ f_0 \xrightarrow{d\tau_0} f_1 \xrightarrow{d\tau_1} \cdots \xrightarrow{d\tau_{N-1}} f_N \}$

The experienced duration, i.e., the proper time accumulated by the chain of transitions of that process, is:

$\tau(\gamma) = \sum_{i=0}^{N-1} d\tau_i$

No amount of external computational power can reduce this sum without changing the process itself, because each $d\tau_i$ corresponds to a physically irreducible interaction. Thus, the notion of “running millions of lifetimes per second” presupposes that conscious experience is independent of transition cost. In the Functional Universe, it is not.
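The irreducibility of $\tau(\gamma)$ can be sketched as a toy model. The floor value `DTAU_MIN` below is a hypothetical constant chosen for illustration; the framework only asserts that some positive floor exists:

```python
# Toy model of a chain of committed transitions, assuming an
# irreducible per-transition duration DTAU_MIN (illustrative value).
DTAU_MIN = 1e-3  # hypothetical floor on d-tau, in arbitrary time units

def proper_time(durations):
    """Accumulated proper time tau(gamma) = sum of d-tau_i.

    Each requested duration is clamped to the floor DTAU_MIN:
    no amount of external computing power can push a committed
    transition below it.
    """
    return sum(max(d, DTAU_MIN) for d in durations)

chain = [0.002, 0.005, 0.0001, 0.003]  # requested d-tau_i
print(proper_time(chain))              # the 0.0001 step is clamped up
```

The clamp is the whole point: "speeding up the hardware" can shrink a requested duration, but never below the floor, so the sum has a hard minimum of `len(chain) * DTAU_MIN`.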

Consequences for the Simulation Argument

Once time is grounded in transition rather than computation, several consequences follow:
  • Simulated universes are not cheap.
  • Simulated minds cannot be mass-produced by simple clock-speed increases.
  • Causally faithful simulations cannot vastly outnumber base realities through acceleration alone.
At best, a simulator could instantiate another universe in real time or, more plausibly, at a slower rate owing to overhead and resource contention. The combinatorial explosion of observers required by the simulation argument no longer follows.
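This point compounds under nesting: if no simulation layer can outrun its host, a stack of simulations only multiplies slowdowns. A minimal sketch, with illustrative per-layer slowdown factors (the 0.8 values are assumptions, not derived quantities):

```python
# Sketch: if each simulation layer runs at a fraction of its host's
# transition rate (factor <= 1, per the no-speedup constraint), then
# nesting multiplies slowdowns instead of multiplying observer counts.
def nested_rate(slowdowns):
    """Effective transition rate of the innermost layer, relative to
    base reality, for a stack of per-layer slowdown factors."""
    rate = 1.0
    for s in slowdowns:
        assert 0 < s <= 1.0, "no layer may outrun its host"
        rate *= s
    return rate

# Three nested layers, each running at 80% of its host's rate:
print(nested_rate([0.8, 0.8, 0.8]))  # roughly 0.512 of base rate
```

Each additional layer can only slow the innermost world down further, which is the opposite of the exponential proliferation the original argument needs.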

Probability Without Fast-Forwarding

Bostrom’s probabilistic conclusion depends critically on the assumption that simulations can dominate observer counts through speed and scale. The Functional Universe removes the mechanism that produces this dominance. Without arbitrarily accelerated realities, the number of simulated observers is constrained by the same transition budgets that govern any physical universe. The statistical asymmetry between simulated and non-simulated minds collapses. This does not disprove the simulation hypothesis, but it deflates its probabilistic force.

A More Modest Simulation Hypothesis

If we inhabit a simulation within the Functional Universe framework, it is not a replay, a shortcut, or a compressed copy.

It is a world that:
  • unfolds through irreducible transitions,
  • accumulates time through causal commitment,
  • and cannot outrun its own future.
In this sense, the metaphysical gap between “simulated” and “real” narrows considerably. Both are functional universes constrained by transition cost. Both must wait for events to occur. The Functional Universe does not deny the possibility of simulations. It denies their assumed abundance.

By grounding time in interaction rather than computation, it removes the illusion of unlimited acceleration that powers the simulation argument. What remains is a quieter, more disciplined metaphysics, one in which reality cannot be fast-forwarded, even by its creators. If the universe is computable, it may also be computationally irreducible in time. And if so, then no one - simulated or otherwise - gets to skip ahead.

Coarse-Grained Simulations and the Role of $d\tau$

The Functional Universe does not prohibit simulations outright. What it prohibits is arbitrary fidelity at arbitrary speed. A simulator may, in principle, construct a coarse-grained simulation by increasing the effective transition duration:

$d\tau \;\longrightarrow\; \tilde{d\tau} \gg d\tau_{\min}$

Such a system compresses history by discarding intermediate transitions, retaining only a subset of committed states:

$f_n \mapsto \tilde{f}_k \approx f_{n+m}, \qquad m \gg 1$

This produces a dynamics that is cheaper to run, but it is no longer transition-faithful. Accelerating a simulation necessarily discards transitions, altering the universe’s functional structure.
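A toy sketch of this coarse-graining, treating states as simple indexed snapshots (the representation is an assumption for illustration only):

```python
# Toy sketch of coarse-graining: keep every m-th committed state and
# discard the transitions between them. The run is cheaper, but the
# intermediate causal structure is simply gone.
def coarse_grain(states, m):
    """Retain f_0, f_m, f_2m, ..., always keeping the final state."""
    kept = states[::m]
    if kept[-1] != states[-1]:
        kept.append(states[-1])  # never drop the endpoint
    return kept

fine = list(range(11))          # snapshots f_0 .. f_10
coarse = coarse_grain(fine, 5)  # [0, 5, 10]: two macro-transitions
print(len(fine) - 1, "->", len(coarse) - 1)  # 10 -> 2 transitions
```

Ten committed transitions collapse into two macro-transitions: the system is faster to run precisely because it no longer instantiates the same transition structure.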

Loss of Functional Equivalence

A faithful emulation of a universe must preserve:
  1. causal ordering,
  2. interaction locality,
  3. minimum transition duration,
  4. aggregation-composition structure.
Coarse-graining violates at least one of these. Enlarging $d\tau$ implies:
  • fewer committed transitions per unit proper time,
  • suppressed aggregation structure,
  • altered decoherence thresholds,
  • distorted causal density.
Formally, the simulation implements a different transition algebra:

$\tilde{\mathcal{C}} \circ \tilde{\mathcal{A}} \neq \mathcal{C} \circ \mathcal{A}$

The resulting system is not the same universe at lower resolution; it is a different functional universe.

Conscious Processes Under Coarse-Graining

If conscious experience corresponds to chains of committed transitions,

$\tau(\gamma) = \sum_i d\tau_i$

then coarse-graining replaces many micro-transitions with a single macro-transition.

This has two consequences:
  • Internal phenomenology, if any, is altered.
  • Causal responsiveness is reduced.
A coarse-grained simulation may contain observers, but they are not observers of the same kind. Their experiences are not blurred versions of ours; they are governed by a different transition economy.

Implications for the Simulation Argument

This distinction collapses the key ambiguity in the simulation hypothesis:
  • Yes, cheap simulations may exist.
  • No, cheap faithful simulations cannot.
Bostrom’s argument requires high-fidelity simulations that preserve the causal structure relevant to conscious experience. Coarse-grained systems do not meet this requirement.

Thus:
  • Large numbers of low-fidelity simulations do not imply large numbers of observers like us.
  • Observer counting must be weighted by transition fidelity, not raw instance count.

Final Conclusion

The Functional Universe framework is presented here as a conceptual lens and computational analogy. It does not assert metaphysical primacy; it shows how physical constraints on time limit the probabilistic force of the simulation argument.

Faithful worlds cannot be fast-forwarded. Fast worlds cannot be faithful.

The Functional Universe permits simulations, but denies the assumption that reality can be duplicated at scale without paying the full cost of its transitions.

Annex: Formal Constraint on Bostrom’s Trilemma

A1. Bostrom’s Trilemma (Formal Statement)

At least one of the following propositions must be true:
  1. Almost no civilizations reach a posthuman stage capable of running ancestor simulations.
  2. Posthuman civilizations have little interest in running such simulations.
  3. Almost all observers with experiences like ours live in simulations.
The inference to (3) depends on the assumption that observer-moments can be generated at arbitrarily high rates by accelerating simulations.

A2. Hidden Assumption: Unbounded Temporal Compression

Let:
  • $N_{\text{base}}$ = number of observers in base reality
  • $N_{\text{sim}}$ = number of simulated observers
  • $R$ = rate of observer-moment production
Bostrom’s dominance condition is effectively:

$N_{\text{sim}} \gg N_{\text{base}} \quad \text{because} \quad R_{\text{sim}} \gg R_{\text{base}}$

This presupposes:

$R_{\text{sim}} \propto \frac{1}{d\tau_{\text{sim}}}, \qquad d\tau_{\text{sim}} \to 0$

A3. Functional Universe Constraint

Postulate (Irreducible Transition Time):

$d\tau \ge d\tau_{\min} > 0$

for any causally faithful transition, independent of substrate.

Time is not a parameter assigned to computation; it is the duration of interaction itself.

A4. Fidelity Constraint

A fidelity-preserving simulation must satisfy:

$d\tau_{\text{sim}} \ge d\tau_{\text{phys}}$

Any attempt to run faster implies coarse-graining and loss of functional equivalence:

$\text{Speedup} \Rightarrow \text{Loss of equivalence}$

A5. Bounding Simulated Observers

If observer-moments correspond to chains of committed transitions:

$\tau(\gamma) = \sum_i d\tau_i$

then for energy budget $E$:

$N_{\text{sim}} \le \frac{E}{k_B T \ln 2} \cdot \frac{1}{d\tau_{\min}}$

This bound applies equally to base and simulated realities.
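As a back-of-envelope illustration, the two factors in the bound can be computed separately: a Landauer-style cap of $k_B T \ln 2$ joules per committed transition, and a serial cap set by $d\tau_{\min}$ on any single causal chain. All numeric inputs below (energy budget, temperature, the value of $d\tau_{\min}$) are illustrative assumptions, not claims of the text:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def transition_budget(energy_j, temperature_k):
    """Landauer-style cap: committed transitions affordable from an
    energy budget, at k_B * T * ln(2) joules per transition."""
    return energy_j / (K_B * temperature_k * math.log(2))

def serial_cap(wall_time_s, dtau_min_s):
    """Transitions a single causal chain can commit in wall_time_s,
    given the floor d-tau_min on each transition."""
    return wall_time_s / dtau_min_s

# Illustrative numbers (assumptions, not from the text):
E = 1.0    # one joule of budget
T = 300.0  # room temperature, kelvin
print(f"{transition_budget(E, T):.3e}")  # roughly 3.5e20 transitions
print(f"{serial_cap(1.0, 1e-15):.3e}")   # ~1e15 per chain per second
```

Whatever the exact numbers, both caps bind the simulator and base reality alike, which is the point of A5.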

A6. Modified Trilemma

Under Functional Universe constraints:
  1. Few civilizations reach simulation maturity, or
  2. Civilizations avoid the immense cost of faithful simulations, or
  3. Simulated observers exist but do not dominate the observer reference class.
The original probabilistic conclusion no longer follows.

A7. Final Statement

Bostrom’s argument loses force not because simulations are impossible, but because time is not compressible. Once time is treated as an irreducible physical product rather than a free computational parameter, the numerical force of the simulation argument largely disappears.

The universe cannot be fast-forwarded - even by its simulators.
