# Problem
The file organization *is* the system's external memory. When agents (AI or human) don't follow organizational patterns, the system degrades — it forgets. This was observed directly: copying 13 notes into a project without a defined method produced circular references, stale metadata, and broken provenance. Each mistake is a small forgetting event.
This is the central architectural problem. IAR's exponential separation ($O(\log N)$ vs $\Omega(N)$) assumes a well-formed index. Degraded organization collapses toward the unindexed case. The same mechanism — procedural drift causing institutional forgetting — is the core argument of the institutional AI work.
# What degrades
| Layer | Invariant | How it breaks | Detection |
|---|---|---|---|
| File location | Type determines directory | Agent puts file in wrong place | `/run-audit file-paths` |
| Frontmatter | Keys match document context | Circular `related:`, stale `status:` after copy | `/run-audit metadata` |
| Cross-references | `related:` is symmetric, `source:` points to origin | One side updated, other forgotten | `/read scan` |
| Index files | `index.md` reflects actual contents | Files added/removed without index update | `/run-audit file-paths` |
| Library lifecycle | Promoted notes have `promoted-to:` | Note copied but original not updated | `/run-audit metadata` |
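The symmetry invariant in the cross-references row can be checked mechanically. A minimal sketch, operating on in-memory frontmatter dicts rather than a real vault on disk (the field name `related` follows the table above; the function name and data layout are assumptions):

```python
# Sketch: find `related:` links that are not reciprocated.
# Each note's frontmatter is represented as a plain dict here; a real
# auditor would parse YAML frontmatter from files on disk.

def find_asymmetric_related(notes: dict[str, dict]) -> list[tuple[str, str]]:
    """Return (note, target) pairs where note lists target in related:
    but target does not list note back."""
    broken = []
    for name, front in notes.items():
        for target in front.get("related", []):
            back = notes.get(target, {}).get("related", [])
            if name not in back:
                broken.append((name, target))
    return broken

notes = {
    "a.md": {"related": ["b.md"]},
    "b.md": {"related": []},  # the back-link was forgotten
}
print(find_asymmetric_related(notes))  # [('a.md', 'b.md')]
```

Because the invariant is symmetric, a single directional scan finds both halves of any broken pair.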
# Principles
- **Operations, not edits.** File moves, copies, and promotions should go through defined methods (`/write mvdir`, the pulling-notes method) rather than raw edits. Methods encode invariants.
- **Redundant invariants.** Multiple overlapping signals: file location + frontmatter + index file + cross-references. If one degrades, others can detect it.
- **Immutable provenance.** Some fields are write-once: `created:`, `source:`. Agents should never modify these after the initial write.
- **Detect, don't just prevent.** Prevention (methods, skills) reduces drift. Detection (audits) catches what prevention misses. Both are needed.
- **Codify after observing.** When a new operation causes drift, capture it as a method. The "pulling notes into projects" convention came from exactly this.
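The immutable-provenance principle can be enforced as a pre-write guard. A sketch, assuming the write-once fields named above (`created`, `source`); the guard function itself is hypothetical, not an existing tool:

```python
# Sketch: reject edits that would modify write-once provenance fields.
# Compare frontmatter before and after a proposed edit; any change to
# an immutable key is a violation.

IMMUTABLE_KEYS = {"created", "source"}

def check_immutable(old: dict, new: dict) -> list[str]:
    """Return the write-once keys a proposed edit would modify or delete."""
    violations = []
    for key in IMMUTABLE_KEYS:
        if key in old and new.get(key) != old[key]:
            violations.append(key)
    return violations

old = {"created": "2026-02-22", "source": "library/note.md", "status": "draft"}
new = dict(old, status="active", created="2026-03-01")  # illegal created edit
print(check_immutable(old, new))  # ['created']
```

Run as a pre-operation check, this turns "agents should never modify these" from a convention into a hard failure.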
# Current defenses
| Defense | Mechanism | Covers |
|---|---|---|
| `/run-audit` | Periodic scan against conventions | File structure, metadata |
| `/write mvdir` | Rename/move with cross-ref fixing | Location changes |
| `/read scan` | Validate all cross-references | Link integrity |
| Methods in the `index.md` hierarchy | Documented procedures (e.g., pulling notes) | Copy/promote operations |
| `index.md` | Human-readable manifest per directory | Content inventory |
# Gaps
- **No pre-operation validation.** Methods are documented but not enforced. An agent can copy a file without following the method.
- **No degradation metric.** Audits report violations but don't measure overall system health over time. We can't answer "is the system more or less organized than last week?"
- **Frontmatter schema undefined.** No registry of which keys are valid for which document types. Invalid keys aren't caught.
- **No post-operation hook.** After a bulk operation (like copying 13 files), nothing triggers an audit automatically.
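The missing degradation metric could be a single score per audit run: violations weighted per layer and normalized by file count, so runs are comparable week over week. A sketch under those assumptions; the layer names come from the table above, and the weights are placeholders:

```python
# Sketch of an organizational degradation score: weighted violations per
# file, 0.0 meaning fully organized. Higher score = more degraded.

LAYER_WEIGHTS = {
    "file-location": 1.0,
    "frontmatter": 0.5,
    "cross-references": 1.0,
    "index-files": 0.5,
    "library-lifecycle": 1.0,
}

def degradation_score(violations: dict[str, int], total_files: int) -> float:
    """Weighted violation count normalized by vault size."""
    if total_files == 0:
        return 0.0
    weighted = sum(LAYER_WEIGHTS.get(layer, 1.0) * n
                   for layer, n in violations.items())
    return weighted / total_files

this_week = degradation_score({"cross-references": 4, "frontmatter": 2}, 200)
last_week = degradation_score({"cross-references": 1}, 187)
print(this_week > last_week)  # True: less organized than last week
```

Normalizing by file count matters: a growing vault accumulates more raw violations even when per-file discipline is stable.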
# Next
- [ ] Define frontmatter key registry (which keys, which types, mutability rules)
- [ ] Add degradation metric to `/run-audit` (organizational entropy score)
- [ ] Consider post-operation audit triggers
- [ ] Connect to inscription experiment: degradation as experimental condition
haak · created 2026-02-22 · zach + claude
Architecture 04 — Drift Resistance — 2026 — Zachary F. Mainen / HAAK