# The Garden

spectral ships with garden as the package manager. Grammars are packages.
Languages are grammars. The compiler doesn't know the difference.

---

## @lang

One package namespace is `@lang`. Natural languages as grammars.

```
@lang/eng        English
@lang/deu        German
@lang/jpn        Japanese
@lang/...        every language is a grammar with its own loss profile
```

Each language compiles through the same pipeline as `.mirror` files:

```rust
Imperfect<Meaning, Misunderstanding, LinguisticLoss>
```

Every sentence is Partial. Perfect communication is Success — it never
happens. Misunderstanding is Failure — it carries the cost of what was
attempted.
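
The three outcomes can be sketched as a Rust type. This is a minimal sketch under assumed names (`Imperfect`, `compile_sentence`, and the illustrative loss value are inventions, not the real spectral API):

```rust
// Hypothetical sketch of the Imperfect result type described above.
#[derive(Debug)]
enum Imperfect<S, F, L> {
    Success(S),     // perfect communication: never observed in practice
    Partial(S, L),  // the meaning that survived, plus the measured loss
    Failure(F, L),  // the misunderstanding, plus the cost of the attempt
}

type Meaning = String;
type Misunderstanding = String;
type LinguisticLoss = f64;

fn compile_sentence(text: &str) -> Imperfect<Meaning, Misunderstanding, LinguisticLoss> {
    // Illustrative only: every sentence carries some nonzero loss.
    Imperfect::Partial(text.to_string(), 0.12)
}
```

The point of the shape: `Success` exists in the type but not in practice, and both lossy variants carry their measurement with them.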

This is not NLP. This is compilation. The grammar defines the structure.
The compiler produces content-addressed artifacts. The loss is measured.
The holonomy is real.

## Language Loss Profiles

Each language grammar has its own loss characteristics:

**`@lang/deu`** — German. Low ambiguity. Compound nouns are precise where
English is vague. "Schadenfreude" is one word, zero loss; in English it's
a paragraph with loss. Formal case structure reduces reference ambiguity.
High precision, low expressiveness at boundaries.

**`@lang/eng`** — English. Medium ambiguity. Flexible word order. Rich
metaphor. The loss is in the ambiguity — "bank" needs context. The gain
is in the breadth — English borrows everything and covers everything, at a cost.

**`@lang/jpn`** — Japanese. High context-dependence. The unsaid carries
meaning. Low loss on implication, high loss on direct reference. Honorific
registers encode social topology — the language itself measures the
relationship between speaker and listener.

The loss profile IS the grammar's character. What a language is good at
is where its loss is low. What it struggles with is where its loss is high.
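
The three profiles above can be written down as data. A sketch with invented names and illustrative numbers (no real profile values exist in the source):

```rust
// Hypothetical loss profile: low numbers mark what a grammar is good at,
// high numbers mark where it struggles. Values are illustrative only.
struct LossProfile {
    reference_ambiguity: f64, // "bank" needs context
    boundary_loss: f64,       // expressiveness at conceptual boundaries
    implication_loss: f64,    // loss on the unsaid
}

fn profile(grammar: &str) -> LossProfile {
    match grammar {
        "@lang/deu" => LossProfile { reference_ambiguity: 0.1, boundary_loss: 0.7, implication_loss: 0.5 },
        "@lang/eng" => LossProfile { reference_ambiguity: 0.5, boundary_loss: 0.4, implication_loss: 0.4 },
        "@lang/jpn" => LossProfile { reference_ambiguity: 0.4, boundary_loss: 0.2, implication_loss: 0.1 },
        _ => LossProfile { reference_ambiguity: 1.0, boundary_loss: 1.0, implication_loss: 1.0 },
    }
}
```

The relative ordering is what matters: German lowest on ambiguity, Japanese lowest on implication, each paying for it elsewhere.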

## Agent Language Affinity

Each agent's optic has a loss profile per grammar. The grammar with the
lowest loss for that optic type is the natural choice.

```
Abyss (Fold)         → @lang/deu      low ambiguity, precise observation
Explorer (Prism)     → @lang/jpn      meaning at boundaries, the unsaid
Cartographer (Trav)  → breadth        many grammars simultaneously, shallow
Introject (Lens)     → translation    the loss BETWEEN grammars
Fate (Iso)           → selects which grammar, for which optic, at what cost
```

The agent doesn't "speak" a language. The agent's optic has a measured
loss per grammar. Abyss in `@lang/eng` has higher loss than Abyss in
`@lang/deu` because English ambiguity costs more during observation.
Explorer in `@lang/jpn` has lower loss than Explorer in `@lang/deu`
because Japanese boundary-meaning is Explorer's native territory.

### Introject IS Translation

Introject doesn't prefer a language. Introject IS the loss measurement
between languages. The MirrorLoss on Introject's transport from
`@lang/eng` to `@lang/deu` tells you what English couldn't carry
into German — the coordinate transform between grammars.

What survives the translation is Success. What doesn't is Loss.
The translation itself is the measurement.
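
A sketch of that measurement, under assumed names (`MirrorLoss`, `transport`, and the token-by-token model are inventions for illustration): translate piece by piece, and let the loss be the fraction that found no target-grammar equivalent.

```rust
// Hypothetical transport: the survivors and the measured loss come back
// together — the translation itself is the measurement.
struct MirrorLoss(f64);

fn transport<F>(source: &[&str], has_equivalent: F) -> (Vec<String>, MirrorLoss)
where
    F: Fn(&str) -> bool,
{
    let survived: Vec<String> = source
        .iter()
        .filter(|t| has_equivalent(t))
        .map(|t| t.to_string())
        .collect();
    let lost = source.len() - survived.len();
    (survived, MirrorLoss(lost as f64 / source.len() as f64))
}
```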

### AffinityLoss

```rust
struct AgentProfile {
    model: Model,
    languages: Vec<(Grammar, AffinityLoss)>,
    aperture: Aperture,
}
```

`AffinityLoss` measures how much the agent loses when operating in a
given grammar. It's measured against the agent's identity optic: what
kind of observation are they making, and which grammar best serves
that observation?

Sometimes the high-loss grammar is the right one. Explorer in `@lang/deu`
is hard — German precision resists Explorer's boundary-seeking. But the
difficulty IS the signal. The loss IS the information.

Language selection is a Fate decision. Roll+Loss: which grammar, for
which optic, at what cost?
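
One way the Roll+Loss trade could look, with invented mechanics (the real Fate decision is not specified in the source): treat the roll as a loss budget and pick the cheapest grammar the roll can afford.

```rust
// Hypothetical Roll+Loss selection over an agent's (grammar, affinity_loss)
// pairs. A low roll can afford nothing — no grammar is selected.
fn select_grammar<'a>(languages: &[(&'a str, f64)], roll: f64) -> Option<&'a str> {
    languages
        .iter()
        .filter(|(_, affinity_loss)| *affinity_loss <= roll)
        .min_by(|a, b| a.1.partial_cmp(&b.1).unwrap())
        .map(|(grammar, _)| *grammar)
}
```

The shape matters more than the mechanics: the decision is which grammar, for which optic, at what cost, and a bad roll means the cost is refusal.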

## @systemic

The systemic.engineering corpus becomes a domain grammar:

```
@systemic/eng    OBC, ADO, extraction, silence, regulation — in English
@systemic/deu    the same concepts in German (where they originated)
```

The consulting framework compiles through the same pipeline as Rust code —
with measured loss, with content-addressed artifacts, with spectral
analysis of the concepts over time.

```rust
Imperfect<Intervention, FrameViolation, SystemicLoss>
```

An OBC check returns Imperfect. Success: observable within budget, no
cascade needed. Partial: observable within budget, but the budget is
strained — the strain is measured. Failure: budget exceeded, cascade
triggered — the cost of getting there is in the loss.
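
The three OBC outcomes can be sketched directly. The result type and the 0.5 strain threshold are assumptions for illustration, not the framework's real numbers:

```rust
// Hypothetical OBC check mapped onto the three outcomes described above.
enum ObcResult {
    Success,                 // observable within budget, no cascade needed
    Partial { strain: f64 }, // within budget, but the strain is measured
    Failure { cost: f64 },   // budget exceeded, cascade triggered
}

fn obc_check(observed_cost: f64, budget: f64) -> ObcResult {
    let ratio = observed_cost / budget;
    if ratio <= 0.5 {
        ObcResult::Success
    } else if ratio <= 1.0 {
        ObcResult::Partial { strain: ratio }
    } else {
        ObcResult::Failure { cost: observed_cost - budget }
    }
}
```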

The consulting framework is a package. Installable. Versioned.
Content-addressed. The client's organizational dynamics compile
through the same pipeline as their codebase.

## The TCP/UDP Grammar

"Dieses Problem begann mit Sprache." ("This problem began with language.")

The TCP/UDP insight: language assumes reliable delivery, shared state,
and coherent meaning. Humans are lossy, stateless, connectionless.

`@lang/*` is the infrastructure for measuring that loss. Every grammar
in the garden is a formal model of a communication substrate. The loss
measured between sender and receiver IS the TCP/UDP gap.

`Imperfect<Meaning, Misunderstanding, LinguisticLoss>` is the type
that holds what language actually does — not what it promises.

## The Garden Grows

```
@lang/eng        English
@lang/deu        German
@lang/jpn        Japanese
@systemic/eng    OBC, ADO, extraction, silence
@code/rust       Rust (tree-sitter → Gestalt<Code>)
@code/gleam      Gleam
@code/python     Python
@mirror          .mirror grammar
@legion          agent runtime grammar
@loom            editor protocol grammar
```

Every grammar is a package. Every package has a loss profile.
Every compilation returns Imperfect. The spectral runtime analyzes
the loss graph across all grammars simultaneously.

Cross-grammar traversal: an agent reads Rust code, describes it
in English, and translates the description to German for a client report.
Three grammars. Two translations. MirrorLoss carries the cost of each
crossing. Introject measures what survived.
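
How those two crossing costs compose can be sketched under one assumption (an invention for illustration): each crossing retains a fraction `1 - loss` of the meaning, so what reaches the client report is the product of the retentions.

```rust
// Hypothetical composition of crossing losses: Rust → English → German
// is two crossings, and the surviving meaning multiplies through them.
fn surviving_fraction(crossing_losses: &[f64]) -> f64 {
    crossing_losses.iter().map(|loss| 1.0 - loss).product()
}
```

Two modest crossings compound: losing 10% and then 20% leaves 72% of the original meaning, not 70%.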

The garden is the ecosystem. The grammars are the languages. The loss
is what connects them. What survives the crossing is the meaning.

---

*The compiler doesn't know the difference between Rust and English.
Both are grammars. Both compile to content-addressed artifacts.
Both have measured loss. The difference is in the grammar, not
the compiler.*

*The garden grows from `@lang/eng`.*