The browser is the new bare metal.
v4.0
READY
DOCUMENT INGESTION OOMB CHUNK-RECURRENT
Drop files here or click to browse
Text, JSON, .pop files — any content
OOMB CHUNK STREAM — O(1) MEMORY FOOTPRINT
Waiting for ingestion…
INGESTED
0
CHUNKS
0
CORE MEM
0
H¹ NORM
0.000
RAG QUERY FISHER-RAO k-NN
RETRIEVAL RESULTS — 0 contexts
Run a query to retrieve relevant memories.
MEMORY BROWSER POINCARÉ BALL LIFECYCLE
CORE (r < 0.3) MID (0.3–0.7) EDGE (r > 0.7)
TOTAL
0
CORE
0
MID
0
EDGE
0
CONTRAD.
0
No memories yet. Ingest some content first.
SHEAF COHOMOLOGY H¹(ℱ) CONTRADICTION DETECTION
H¹ = 0.000
Coboundary norm — 0 = coherent · >threshold = contradiction
EDGES
0
CROSS-MODAL
0.000
CONTRADICTIONS
0
THRESHOLD
0.500
CONTEXT GRAPH
RESTRICTION MAPS (cross-modal edge weights)
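The H¹ readout above can be sketched as a coboundary-norm computation. A minimal sketch only: the edge record shape and the scale/bias parameterization of the restriction maps are illustrative assumptions, not RAG Time's actual internals.

```javascript
// Coboundary-norm sketch for the panel above (edge record shape and the
// scale/bias map form are illustrative assumptions): for each cross-modal
// edge (u, v) with restriction map r_uv, accumulate || r_uv(xu) - xv ||^2.
function h1Norm(edges) {
  let total = 0;
  for (const { xu, xv, map } of edges) {
    for (let i = 0; i < xu.length; i++) {
      const d = map.s * xu[i] + map.b[i] - xv[i]; // coboundary component
      total += d * d;
    }
  }
  return Math.sqrt(total); // 0 = coherent; above threshold = contradiction
}
```

With identity maps and agreeing sections on every edge, the norm is exactly zero, which is the "coherent" state the panel reports.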
RWKV-v7 EMBEDDER RECURRENT STATE GEOMETRY

RAG Time uses the RWKV-v7 "Goose" recurrent state as the embedding vector. The final hidden state after processing a text sequence IS the embedding — same representational geometry as Evangelion. Memory and mind share a latent space. No separate embedding model. No external weights. Built, not assembled.
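The state-as-embedding idea can be sketched in a few lines. The mixing rule below is a toy stand-in for the RWKV-v7 block update, and DIM is shrunk from 64 to 8 for brevity; only the shape of the idea matches RAG Time.

```javascript
// State-as-embedding sketch (the mixing rule is a toy stand-in for the
// RWKV-v7 block update; DIM shrunk from 64 to 8 for brevity): run a
// recurrent cell over token ids, return the final hidden state.
const DIM = 8;

function embed(tokens) {
  const h = new Float64Array(DIM); // recurrent state, zero-initialized
  for (const t of tokens) {
    for (let i = 0; i < DIM; i++) {
      h[i] = Math.tanh(0.9 * h[i] + Math.sin((t + 1) * (i + 1)));
    }
  }
  return h; // the final state IS the embedding
}
```

No projection head, no pooling: the recurrent state after the last token is the vector that gets stored and compared.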

DIM
64
LAYERS
4
NORM
VARIANCE
INTEGRATION SPEC postMessage API

RAG Time exposes a clean postMessage API for integration with Evangelion, LocalVocal, Nodemesh, and the Sovereign Engine. Embed as an iframe or load as a worker. All communication is message-passing — zero shared state.

INGEST
type: 'rt:ingest'
payload: { text: string, label: string, meta?: object }
→ Chunks text via OOMB, embeds with RWKV-v7, stores in SheafMemory
response: 'rt:ingested' { id, poincareR, novelty, h1norm }
QUERY
type: 'rt:query'
payload: { text: string, k?: number, label?: string }
→ Embeds query, Fisher-Rao k-NN search, contradiction check
response: 'rt:result' { contexts[], contextBlock, contradictionFlag, h1norm }
EXPORT / IMPORT
type: 'rt:export' → 'rt:snapshot' { contexts, step, contradictions }
type: 'rt:import' { snapshot } → restores full SheafMemory state
type: 'rt:status' → 'rt:status' { total, core, edge, h1norm, contradictions }
type: 'rt:loadWeights' { blocks, dim, layers } → inject Evangelion weights into embedder
type: 'rt:exportEmbedder' → 'rt:embedderExported' — export adapted embedder back to Evangelion (v4)
EXAMPLE — EVANGELION INTEGRATION
// In Evangelion: embed RAG Time as an iframe, then:
const rt = document.getElementById('ragtime-frame').contentWindow;

// Ingest a memory after each RWKV generation step
rt.postMessage({
  type: 'rt:ingest',
  payload: {
    text: generatedText,
    label: 'text',
    meta: { step: App.trainStep, tstar: App.tstar }
  }
}, '*');

// Before each generation: retrieve relevant context
rt.postMessage({ type: 'rt:query', payload: { text: currentPrompt, k: 4 } }, '*');

window.addEventListener('message', e => {
  if (e.data.type === 'rt:result') {
    const { contextBlock, contradictionFlag } = e.data.payload;
    // Prepend contextBlock to the generation prompt
    // If contradictionFlag: optionally trigger the autopoietic loop
  }
});
EXAMPLE — SOVEREIGN ENGINE (Rust) INTEGRATION
// Spawn RAG Time in a headless browser (e.g. chromium --headless)
// Bridge via WebSocket or stdio JSON
// In Rust (tokio); Result is a crate-level alias over the bridge's error type:
async fn query_ragtime(client: &RagTimeClient, text: &str) -> Result<RagResult> {
    client.send(json!({
        "type": "rt:query",
        "payload": { "text": text, "k": 5 }
    })).await?;
    client.recv::<RagResult>().await
}
// The contradiction flag maps directly to the H¹(ℱ) ≠ 0 trigger
// for the Sovereign Engine's autopoietic rewrite loop.
EXAMPLE — NODEMESH SEMANTIC ROUTING
# Each Nodemesh node runs RAG Time
# Routing: compare query embedding against node centroids
# Fisher-Rao distance replaces IP-topology routing
# Python coordinator:
async def semantic_route(query: str, nodes: list[Node]) -> Node:
    embeddings = await asyncio.gather(
        *[node.ragtime.embed(query) for node in nodes]
    )
    # Fisher-Rao nearest neighbor: each node's query embedding vs its centroid
    best, _ = min(
        zip(nodes, embeddings),
        key=lambda pair: fisher_rao(pair[1], pair[0].centroid),
    )
    return best
WEIGHT SHARING rt:loadWeights — EVANGELION SYNC

Import trained RWKV-v7 block weights from a running Evangelion instance. Once loaded, RAG Time's embeddings share the same latent geometry as Evangelion's cognition — memory and mind inhabit the same representational space.

No weights loaded — using random initialization
No weights loaded.
LAYERS SYNCED
0 / 4
DIM
TRAIN STEP
STATUS
RANDOM
BINARY HYPERCUBE INDEX LITTLEBIT-2 XNOR/POPCNT

LittleBit-2 Joint-ITQ binarization of all stored embeddings into a binary hypercube index. XNOR/POPCNT similarity as the fast path (O(N/64) per query via bitwise ops). Fisher-Rao reranking on the top-K POPCNT candidates for precision. Scales to thousands of documents — the linear scan was the v1 bottleneck; this index replaces it.
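The XNOR/POPCNT fast path can be sketched as follows. This is illustrative, not the LittleBit-2 code: the sign-bit packing scheme and the function names are assumptions.

```javascript
// XNOR/POPCNT sketch: pack sign bits into 32-bit words, score similarity
// as the number of agreeing bits via XNOR + popcount (packing scheme and
// names are illustrative, not the LittleBit-2 internals).
function binarize(vec) {
  const words = new Uint32Array(Math.ceil(vec.length / 32));
  vec.forEach((v, i) => { if (v >= 0) words[i >> 5] |= 1 << (i & 31); });
  return words;
}

function popcount(x) { // standard 32-bit bit-counting trick
  x = x - ((x >>> 1) & 0x55555555);
  x = (x & 0x33333333) + ((x >>> 2) & 0x33333333);
  x = (x + (x >>> 4)) & 0x0f0f0f0f;
  return (x * 0x01010101) >>> 24;
}

function similarity(a, b, dim) { // matching bits out of dim
  let match = 0;
  for (let i = 0; i < a.length; i++) {
    match += popcount(~(a[i] ^ b[i]) >>> 0); // XNOR: 1 where bits agree
  }
  return match - (a.length * 32 - dim); // discard padding bits
}
```

Each word comparison scores 32 dimensions in a handful of ALU ops, which is where the O(N/64)-style speedup over a float linear scan comes from; Fisher-Rao reranking then restores precision on the shortlist.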

INDEXED
0
BITS/VEC
INDEX KB
SPEEDUP
Build index to visualize binary vectors.
Run benchmark to compare retrieval speed.
RESTRICTION MAP LEARNING CONTRADICTION-DRIVEN SGD

Cross-modal restriction maps initialize as identity (scale=1, bias=0) — semantically inert. When H¹(ℱ) spikes past the contradiction threshold, SGD steps push the maps toward coherence. Over time, the maps learn genuine cross-modal alignment geometry. Kehai noted this is "structurally present but semantically inert until trained." This fixes that.
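One SGD step on a restriction map can be sketched as below. The per-edge scale-plus-bias parameterization is an illustrative assumption; only the identity initialization (scale = 1, bias = 0) is taken from the panel above.

```javascript
// Contradiction-driven SGD sketch (scale/bias parameterization is an
// illustrative assumption). One step descends the coboundary residual
// || s * xu + b - xv ||^2 on a single cross-modal edge.
function sgdStep(map, xu, xv, lr) {
  let gs = 0;
  const gb = new Float64Array(xu.length);
  for (let i = 0; i < xu.length; i++) {
    const r = map.s * xu[i] + map.b[i] - xv[i]; // residual on this edge
    gs += 2 * r * xu[i]; // d(r^2)/ds
    gb[i] = 2 * r;       // d(r^2)/db_i
  }
  map.s -= lr * gs;
  for (let i = 0; i < xu.length; i++) map.b[i] -= lr * gb[i];
}

// Identity-initialized map: semantically inert until trained
const edgeMap = { s: 1, b: new Float64Array(2) };
sgdStep(edgeMap, [1, 0.5], [0.2, -0.3], 0.05);
```

Because the residual is exactly the edge's contribution to H¹, each step that shrinks it pulls the coboundary norm down toward coherence.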

LR: Trigger H¹ >
STEPS
0
H¹ NOW
H¹ INIT
REDUCTION
MAPS
0
No training yet
No cross-modal edges yet. Ingest content with multiple modality labels.
CORPUS-DRIVEN EMBEDDING ADAPTATION CONTRASTIVE SELF-SUPERVISED

No external labels. The sheaf supervises the embedder.
Same-document chunk pairs are pulled together in latent space. High-H¹ contradiction pairs are pushed apart. Contrastive signal comes entirely from ingested structure. Runs automatically after corpus exceeds threshold, or manually here.
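The pull/push objective can be sketched with a standard contrastive loss. The hinge form and the pair construction here are illustrative assumptions; the source only specifies where the pairs come from (same-document chunks vs. high-H¹ contradictions).

```javascript
// Sheaf-supervised contrastive sketch (hinge form is an illustrative
// assumption): same-document chunk pairs are pulled together,
// high-H¹ contradiction pairs are pushed apart past a margin.
function dist(a, b) {
  let s = 0;
  for (let i = 0; i < a.length; i++) s += (a[i] - b[i]) ** 2;
  return Math.sqrt(s);
}

function contrastiveLoss(pullPairs, pushPairs, margin) {
  let L = 0;
  for (const [a, b] of pullPairs) L += dist(a, b) ** 2; // pull: shrink distance
  for (const [a, b] of pushPairs) {
    L += Math.max(0, margin - dist(a, b)) ** 2; // push: enforce a margin
  }
  return L / (pullPairs.length + pushPairs.length);
}
```

Push pairs already separated by more than the margin contribute nothing, so the gradient focuses on unresolved contradictions.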

LR: Auto-threshold: docs
STEPS
0
PULL PAIRS
0
PUSH PAIRS
0
CONTRASTIVE LOSS
EMBED DRIFT
No adaptation yet
Run a step to see pairs.
CENTRALITY-WEIGHTED INGESTION DEPTH POINCARÉ-GATED RWKV STEPS

Core memories (low Poincaré radius, frequently retrieved) receive more RWKV recurrent steps on re-ingestion. High-centrality content accumulates more representational capacity. Poincaré lifecycle wired into OOMB ingestion depth.
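The gating can be sketched as a simple schedule. The linear interpolation and the depth constants below are illustrative assumptions; the source specifies only the direction (lower radius, deeper embed) and the base/max depth knobs.

```javascript
// Poincaré-gated ingestion depth sketch (linear schedule and constants are
// illustrative): memories nearer the origin (low radius r) get more RWKV
// recurrent steps, from MAX_DEPTH at r = 0 down to BASE_DEPTH at r = 1.
const BASE_DEPTH = 2, MAX_DEPTH = 8;

function ingestionDepth(r) {
  const clamped = Math.min(1, Math.max(0, r)); // Poincaré radius in [0, 1)
  return Math.round(MAX_DEPTH - (MAX_DEPTH - BASE_DEPTH) * clamped);
}
```

A core memory (r < 0.3) thus gets several extra recurrent steps on re-ingestion, while edge memories stay near the base depth.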

BASE DEPTH
MAX DEPTH
CORE BOOSTS
0
AVG DEPTH
INGESTION DEPTH BY POINCARÉ RADIUS lower r = deeper embed
EMBEDDER WRITEBACK rt:exportEmbedder → EVANGELION

RAG Time's adapted embedder state can now flow back upstream. Contradiction resolutions and contrastive adaptations become Evangelion weights. Bidirectional geometry sync — memory informs mind.

No adaptation yet — adapt corpus before writeback
ADAPT STEPS
0
LAYERS
4
LAST EXPORT
STATUS
UNADAPTED
SYSTEM LOG