Big-Loop×Attention + Learnable×Construction

Type: intersection (second-order)
Slug: intersection—big-loop-attention-episodic-consolidation
Parents: intersection—big-loop-recurrence-attention, intersection—learnable-nature-hippocampal-construction
Last updated: 2026-05-14
Epistemic status: Extrapolative


The combination

Big-loop attention integrates information across episodes (first intersection). Construction is structure prediction from stored elements (second intersection). Combined: episodic memory consolidation is big-loop attention over constructed scenes. The hippocampus builds scenes (construction), then iteratively integrates them across days (big-loop attention) into a coherent life narrative.
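
In code, the combination is a two-stage pipeline. A minimal sketch using generic transformer components; every name here (construct_scene, big_loop, d_model) is illustrative, not taken from the parent notes:

```python
import torch
import torch.nn as nn

d_model = 128

# Stage 1 (construction): predict a scene representation from stored elements.
construct_scene = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)

# Stage 2 (big-loop attention): integrate constructed scenes across episodes.
big_loop = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)

elements = torch.randn(1, 12, d_model)                   # stored elements of one episode
scene = construct_scene(elements).mean(1, keepdim=True)  # one constructed scene vector
scenes = torch.randn(1, 30, d_model)                     # a month of such scene vectors
narrative, _ = big_loop(scenes, scenes, scenes)          # cross-episode integration
```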

What emerges

This reframes memory consolidation as a structure-prediction problem. The “structure” being predicted is a coherent autobiographical narrative; the “input” is a sequence of isolated constructed scenes. Big-loop attention iteratively refines the global representation by passing scene representations through the cortex and back, much as AlphaFold iteratively refines its structure prediction by recycling residue representations through its attention layers.
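
A hedged sketch of that recycling loop, assuming a single shared transformer layer stands in for the cortical leg of the loop; the class name and cycle count are assumptions for illustration:

```python
import torch
import torch.nn as nn

class BigLoopRefiner(nn.Module):
    """Recycling-style refinement over a sequence of constructed scenes."""

    def __init__(self, d_model: int = 128, n_heads: int = 4, n_cycles: int = 4):
        super().__init__()
        self.cortex = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.n_cycles = n_cycles

    def forward(self, scenes: torch.Tensor) -> torch.Tensor:
        # scenes: (batch, n_scenes, d_model), one row per constructed scene
        state = scenes
        for _ in range(self.n_cycles):
            # One cycle stands in for one hippocampus -> cortex -> hippocampus
            # pass; re-injecting the raw scenes mirrors AlphaFold feeding its
            # inputs back in alongside the recycled estimate.
            state = self.cortex(state) + scenes
        return state

refined = BigLoopRefiner()(torch.randn(2, 7, 128))  # 7 scenes, refined jointly
```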

Gap

Consolidation is typically framed as “memory transfer” (hippocampus → neocortex). This intersection reframes it as “structure inference” — the system is inferring the latent structure that connects separate experiences. No paper in the corpus uses this framing.

Generative potential

Architecture: A “life-narrative” model that takes sequences of constructed scene embeddings and uses big-loop attention to infer the latent structure connecting them. This could model how autobiographical memory becomes coherent over time — not by transferring memories, but by inferring the structure that makes them a story.
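
One way the proposal could be prototyped: a learnable narrative query refined over a few big-loop cycles. LifeNarrativeModel and all of its details are a hypothetical design sketch, not an implementation from the corpus:

```python
import torch
import torch.nn as nn

class LifeNarrativeModel(nn.Module):
    """Infer the latent structure connecting a sequence of scene embeddings."""

    def __init__(self, d_model: int = 128, n_heads: int = 4, n_cycles: int = 3):
        super().__init__()
        self.narrative = nn.Parameter(torch.zeros(1, 1, d_model))  # latent-structure query
        self.read = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.update = nn.Linear(d_model, d_model)
        self.n_cycles = n_cycles

    def forward(self, scenes: torch.Tensor) -> torch.Tensor:
        # scenes: (batch, n_scenes, d_model), one constructed scene per row
        query = self.narrative.expand(scenes.size(0), -1, -1)
        for _ in range(self.n_cycles):
            # Big loop: the current narrative estimate queries every scene,
            # and the read-out updates the estimate for the next pass.
            read_out, _ = self.read(query, scenes, scenes)
            query = query + torch.tanh(self.update(read_out))
        return query.squeeze(1)  # inferred structure connecting the scenes

structure = LifeNarrativeModel()(torch.randn(4, 20, 128))  # 20 scenes -> (4, 128)
```

The narrative query plays the role of the inferred latent structure: nothing is transferred out of the scene representations, yet a global account of them accumulates in the query.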


Falsification

If consolidation improves memory performance without reorganising the representational structure of the scenes (i.e., it merely strengthens existing representations), the structure-inference claim is false.
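
A hedged sketch of how that test could be operationalised, assuming representational dissimilarity matrices (RDMs) as the measure of scene structure; the metric and function names are our choices, not a protocol from the corpus:

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def rdm(reps: np.ndarray) -> np.ndarray:
    # reps: (n_scenes, d) -> condensed vector of pairwise correlation distances
    return pdist(reps, metric="correlation")

def consolidation_effect(pre: np.ndarray, post: np.ndarray) -> dict:
    # Strengthening: representations grow in magnitude but keep their geometry.
    strengthening = (np.linalg.norm(post, axis=1).mean()
                     / np.linalg.norm(pre, axis=1).mean())
    # Reorganisation: the pairwise-similarity structure itself changes.
    geometry_r, _ = spearmanr(rdm(pre), rdm(post))
    return {"strengthening": float(strengthening),
            "reorganisation": float(1.0 - geometry_r)}
```

High strengthening with near-zero reorganisation across consolidation would count against the structure-inference framing.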