SWAZEENET
VOL. I · NO. 01 · EST. MCMLXXXVII
BROADSIDE 12 active · 2026·05·14
№ 10 · experiment · C++ / HLSL / Three.js · 2025—

omega,
two layers arguing with each other.

Hybrid project: ~2,150 lines of research markdown on emergent gravity and holography, paired with a GPU-compute N-body cosmic simulator rendered both as a Three.js prototype and a UE 5.4 implementation.

physics ue5

omega is two layers that feed each other. The research layer is markdown essays: adversarial falsification of Lorenz-EnKF, ΛCDM tensions, decoherence vs. "consciousness causes collapse", spheres as dimensional interfaces, the unified-theory synthesis. The implementation layer is a cosmic-evolution simulator built twice: a Node + Three.js web prototype at 262K particles, and an Unreal Engine 5.4 project that targets 500K–1M particles at 60 fps. The engine code is a physical realization of the research arguments — Verlinde emergent gravity, Ryu-Takayanagi holography, softened N-body with Plummer ε = 50 UU.

I · Research documents

The five canonical documents are the primary deliverable of the research layer — not summaries or trimmings, and not regenerated from prompts on demand. They are read in order, because the synthesis depends on the four enquiries that lead to it, and a reader who skips ahead to the unified theory will see conclusions whose evidence lives in the documents they did not read.

II · Implementation

GPU-compute first. For particle and field work, Niagara compute modules in UE 5.4 and Three.js custom shaders in the web prototype are preferred over CPU loops — not as an optimization but as a structural choice. The N-body math has to live where the particles are, which on modern hardware is the GPU; running it on the CPU and shipping coordinates to the GPU per frame would saturate the bus before it saturated either compute unit.

Plummer softening is present in every N-body kernel because the unsoftened gravitational potential goes singular at small radii and the simulator has no business reporting F = ∞. UE 5.4 uses ε = 50 Unreal Units (~0.5 m at world-scale) in GravitationalForce.ush and DensityWaveSpiral.ush; the web prototype uses per-mode scene-unit constants in the 0.5–3.0 range because Three.js world-scale is not Unreal Units and a single ε would either over-soften some modes or fail to soften others. The 262K-particle web ceiling is what Three.js can keep at 60 fps on the target GPU; the 1M target for UE 5.4 is a Niagara compute-shader budget, not a CPU one.
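The softened kernel can be sketched as a plain C++ reference of the math — this is illustrative, not the .ush source; unit masses and G = 1 units are assumptions:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };

// Plummer-softened gravitational acceleration on a test particle, where r
// points from the particle toward a source of mass m (G = 1 units).
// The + eps*eps term keeps the denominator finite as |r| -> 0, so the
// kernel never reports an infinite force for coincident particles.
Vec3 softenedAccel(Vec3 r, double m, double eps) {
    double r2 = r.x * r.x + r.y * r.y + r.z * r.z + eps * eps;
    double inv = m / (r2 * std::sqrt(r2));   // m / (r^2 + eps^2)^(3/2)
    return { r.x * inv, r.y * inv, r.z * inv };
}
```

At separations much larger than ε the result converges to the unsoftened 1/r² law; at exactly zero separation the acceleration is zero rather than divergent, which is the whole point of the term.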

UE 5.4 source layout: GenesisUE5/Source/GenesisUE5/{Core,Particles,Audio,Camera,UI}/; shaders under GenesisUE5/Shaders/{Niagara,Materials}/. The handoff doc carries a deliberate "continuous re-contextualization" directive: every working session begins by verifying filesystem state against the inventory rather than trusting stale claims, because a research codebase that mixes generated assets, shader edits, and Niagara graph changes drifts from its README within a few sessions if that habit is not enforced.
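The re-contextualization habit can be mechanized. A minimal sketch, assuming a hard-coded inventory of relative paths (the handoff doc's actual inventory format is not shown here):

```cpp
#include <filesystem>
#include <string>
#include <vector>

namespace fs = std::filesystem;

// Return the inventory entries that are missing on disk. A session that
// starts by checking this list, instead of trusting a stale README,
// catches drift from generated assets and shader edits immediately.
std::vector<std::string> missingPaths(const fs::path& root,
                                      const std::vector<std::string>& inventory) {
    std::vector<std::string> missing;
    for (const auto& rel : inventory)
        if (!fs::exists(root / rel))
            missing.push_back(rel);
    return missing;
}
```

Called as, e.g., `missingPaths("GenesisUE5", {"Source/GenesisUE5/Core", "Shaders/Niagara"})`, a non-empty result means the filesystem and the inventory have diverged and the session starts by reconciling them.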

III · Two layers, one system

The implementation is a physical realization of the research arguments, not an illustration of them. Verlinde’s emergent-gravity proposal lives in the same simulator as the Ryu-Takayanagi-flavored holography term, because the unified-theory synthesis treats them as facets of a single substrate rather than as competing models. When the simulator runs a galaxy-scale N-body with both terms enabled and reproduces the MOND a0 behavior at the right radius, that is the research argument advancing on a screen, not a separate demo.
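As a toy illustration of the a0 crossover — not the simulator's actual force terms; the standard "simple" MOND interpolating function μ(x) = x/(1+x) is an assumption here — the circular velocity around a point mass flattens once the Newtonian acceleration drops below a0:

```cpp
#include <cmath>

// Circular velocity around a point mass M at radius r, G = 1 units.
// Newtonian: v^2 = aN * r with aN = M / r^2.
// MOND-like, with the "simple" interpolating function mu(x) = x / (1 + x):
// solving a * mu(a / a0) = aN gives
//   a = aN / 2 + sqrt(aN^2 / 4 + aN * a0)
// which tends to aN for aN >> a0 and to sqrt(aN * a0) for aN << a0.
double mondCircularVelocity(double M, double r, double a0) {
    double aN = M / (r * r);
    double a  = 0.5 * aN + std::sqrt(0.25 * aN * aN + aN * a0);
    return std::sqrt(a * r);
}
```

In the deep regime v⁴ → M·a0, so v becomes independent of radius — the flat rotation curve the research layer cares about, produced by the a0 term alone.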

IV · Acknowledged limits

The unified-theory synthesis is explicit about the open problems and treats them as design constraints rather than failures: the de Sitter holography gap remains open, the framework does not predict the CMB power spectrum from first principles, Verlinde gravity has observational mismatches at cluster scale that have not been resolved, the background-independence question is unresolved, and there is no proposed direct-detection experiment for the entanglement-substrate model.

The simulator does not pretend to resolve any of these. What it does is let an observer watch the consequences of accepting them — if entanglement does generate geometry and gravity emerges from that, what does a galaxy actually look like at 1M particles with both terms enabled? The answer is a different and more honest contribution than a paper claiming closure on problems that are still open.

Fig. I. Source lines by module, as of 2026-04-26: Core 1,024 (43%), UI 412 (17%), Particles 332, Camera 307, Audio 286.
Fig. II. Per-frame pipeline, as of 2026-04-26: 01 force calc → 02 Plummer soften → 03 integration → 04 boundary → 05 render.
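The stages in Fig. II can be sketched as a single CPU reference step — illustrative only, since the shipping version runs as Niagara compute; unit masses, 2-D, G = 1 units, a semi-implicit Euler integrator, and a periodic boundary are all assumptions of this sketch:

```cpp
#include <cmath>
#include <vector>

struct Particle { double x, y, vx, vy; };

// One frame of the Fig. II pipeline:
// 01 force calc + 02 Plummer soften -> 03 integration -> 04 boundary.
// (05 render is the GPU's job and is omitted here.)
void step(std::vector<Particle>& ps, double dt, double eps, double bound) {
    for (auto& p : ps) {
        double ax = 0, ay = 0;
        for (const auto& q : ps) {                      // 01: pairwise forces
            double dx = q.x - p.x, dy = q.y - p.y;
            double r2 = dx * dx + dy * dy + eps * eps;  // 02: Plummer soften
            double inv = 1.0 / (r2 * std::sqrt(r2));    //     (self-pair -> 0)
            ax += dx * inv;
            ay += dy * inv;
        }
        p.vx += ax * dt;                                // 03: semi-implicit Euler
        p.vy += ay * dt;
    }
    auto wrap = [bound](double v) {                     // 04: periodic boundary
        double w = std::fmod(v + bound, 2 * bound);
        if (w < 0) w += 2 * bound;
        return w - bound;                               // into [-bound, bound)
    };
    for (auto& p : ps) {
        p.x = wrap(p.x + p.vx * dt);
        p.y = wrap(p.y + p.vy * dt);
    }
}
```

All velocities are updated against the same snapshot of positions before any position moves, which is the property the real compute-dispatch version has to preserve across its stages.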

V · Numbers

262K particles in the web prototype is the upper end of what Three.js can keep at 60 fps on the target GPU before per-frame buffer uploads start dominating frame time; the UE 5.4 target of 1M particles is a Niagara compute-shader budget rather than a CPU one, which is why the simulator was built twice in the first place. Softening constants are as given above: ε = 50 UU in the UE 5.4 kernels, per-mode scene-unit values in the 0.5–3.0 range on the web. Of the research markdown, 1,634 lines are the five canonical documents; CLAUDE.md and ancillary notes are excluded from that count.

VI · Constraints

The open problems enumerated in the synthesis document are design constraints rather than failure modes, and the simulator does not pretend to resolve them; it lets an observer watch the consequences of accepting them. If entanglement does generate geometry and gravity does emerge from that substrate, what does a galaxy actually look like at 1M particles with both the Verlinde term and the Ryu-Takayanagi-flavored holography term enabled?
