An apparatus for the patient
Athanor
A slow fire for turning raw material into refined work — together with the machine, not in spite of it.
i. What if you could
What if the way most people are using AI right now is backwards?
The dominant story says: build agents that need less human input. Hand them the problem, walk away, return to a deliverable. The human is a constraint to minimize. The agent is the worker. Capability scales by removing you from the loop.
But ask anyone who's actually used these tools at depth: the output is grey. Competent, plausible, untrustworthy in the same way a well-meaning stranger's advice is untrustworthy. It reads like every other thing that came out of a model. Marketers have a name for it now. Slop.
Slop is what you get when you ask the average of every pattern to speak for you. Slop scales beautifully. Slop is also worthless when distinction is the product.
Athanor is a bet that the orthodoxy has it the wrong way around. The point is not to remove the human. The point is to make the human's taste cheap to inject — at every layer, at every step, into every artifact.
The model is the string. You are the body of the instrument.
ii. The shift
Two opposite bets about how this works.
Autonomy is the moat.
The agent does more. The human does less. The work scales because the agent's responsibility expands. Quality follows from better models, better tools, better orchestration — not from human attention.
What it produces: output at volume that reads, at scale, like every other agent's output.
What it costs: the work loses a center. There's nobody home in it.
Cognitive overlap is the moat.
The model produces patterns. The human applies taste. The work scales because injecting taste becomes cheap — not because taste becomes optional. Quality follows from the human's continued presence in the loop, made possible by structure that respects their time.
What it produces: work with a center. A signature you can hear.
What it costs: patience. This is not a one-shot system.
The two bets sound almost like style choices. They're not. They lead to opposite architectures. If autonomy is the moat, you build agents that own outcomes and minimize check-ins. If cognitive overlap is the moat, you build a substrate where the human can land at every meaningful decision without paying re-orientation tax. The substrate is most of the work.
Athanor is the substrate.
iii. The metaphor
The string and the body.
A guitar string vibrates. That's a sound. It's not yet music.
What turns the vibration into music is the body of the instrument: shape, wood, tension, bracing, cavity. The body modulates what comes off the string. Without the body, you have a frequency. With the body, you have a voice.
Large language models are strings. They produce vibration — patterns, the average of patterns, the slop on a flat plane. What you do with that vibration depends entirely on the body you build around it.
In Athanor:
The source material is the reality barrier. Files, transcripts, brand documents, code, anything that came from outside the system. Source is sacred — it is the only thing in the apparatus that the AI never edits, ever, under any circumstance. It is your tether to the world. It is what the body is anchored to.
The frames are the body's geometry. Composed views the human builds, in conversation with the AI, over the source. Frames decide what gets pulled, what gets surfaced, what gets ignored. Frames are where taste lives in structural form.
The leaves are what gets generated through the body — refined output with full provenance back to source. Each leaf knows which frame produced it, which branch positioned it, which prompt invoked it, and which source leaves it derived from.
The summons are pre-curated context payloads the human has reviewed, pruned, edited, and locked. They are taste solidified. They are reusable across generation runs.
And every generation pluck — every act of asking the model to vibrate — passes through this body. The result is not slop. It has edges. It has a signature. It came out of your apparatus, not the gray middle.
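The four parts of the body above can be sketched as data types, plus a walk back along provenance links to the reality barrier. A minimal sketch: the names and fields here are illustrative assumptions, not the actual schema.

```typescript
// Illustrative shapes for the apparatus's core entities.
// Names and fields are assumptions, not the real schema.

type LeafId = string;

interface Leaf {
  id: LeafId;
  source: boolean;               // true = reality barrier; never AI-edited
  content: string;
  tokenCount: number;
  derivedFromLeafIds: LeafId[];  // provenance; empty for source leaves
  producedByFrameId?: string;    // which frame minted this leaf
}

interface Frame {
  id: string;
  kind: "source" | "analytical" | "generative";
  status: "draft" | "explored" | "canon" | "abandoned";
  parentFrameId?: string;        // lineage: frames branch into a DAG
  promptTemplate?: string;       // generative frames only
}

interface Summon {
  id: string;
  name: string;
  leafIds: LeafId[];             // reviewed, pruned, accepted by the human
  locked: boolean;               // locked = a saved decision, immutable
}

// Walk derivation links back to the reality barrier.
function traceToSource(id: LeafId, byId: Map<LeafId, Leaf>): LeafId[] {
  const leaf = byId.get(id);
  if (!leaf) return [];
  if (leaf.source) return [id];
  const roots = new Set<LeafId>();
  for (const parent of leaf.derivedFromLeafIds) {
    for (const s of traceToSource(parent, byId)) roots.add(s);
  }
  return [...roots];
}
```

The point of the sketch is the invariant: every generated leaf resolves, through however many derivations, to source leaves that came from outside the system.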
Get high with your AI.
Don't outsource the trip.
iv. The rhythm
Slow, deliberate, iterative — like how anything good actually gets made.
The system has a rhythm. Once you know it, you stop having to think about which lever to pull. There are six beats. They repeat.
Ingest
Pull source material into the apparatus. Each file becomes a leaf, stringified, hashed, token-counted, immutable. The reality barrier.
Compose
Build a frame in conversation with the AI. The AI proposes structure based on what it sees in source; you push back, edit, lock. The frame is your structural opinion.
Summon
Pull candidate leaves into a curation view. Prune the irrelevant, edit the imperfect, regenerate the weak, add what's missing. Lock the result as a summon.
Generate
Spawn one agent per branch. Each agent receives the locked summon as context, the frame's prompt template, and a hard output budget, and returns refined leaves.
Roll up
Summary leaves are written at parent branches, condensing children. The next iteration can pull these summaries instead of the heavy children — the apparatus stays cheap.
Iterate
Compose the next frame, deriving from any branch in any prior frame. Some children become canon. Some get abandoned. The exploration is a tree, not a line.
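The roll-up beat can be sketched as a pure function: given heavy children and a budget, produce a summary leaf the next iteration can pull instead. In the real apparatus a cheap model would write the summary; here a deterministic condenser stands in, and all names are illustrative.

```typescript
// Roll-up sketch: replace heavy children with a cheap parent summary.
// condense logic is a stand-in for a low-stakes model call.

interface ChildLeaf {
  id: string;
  content: string;
  tokenCount: number;
}

interface SummaryLeaf {
  content: string;
  tokenCount: number;
  derivedFromLeafIds: string[]; // provenance back to the condensed children
}

// Crude token estimate: roughly one token per four characters.
const estimateTokens = (s: string): number => Math.ceil(s.length / 4);

function rollUp(children: ChildLeaf[], outputBudget: number): SummaryLeaf {
  // Stand-in for a model call: keep the first sentence of each child.
  const firstSentences = children.map(
    (c) => c.content.split(/(?<=[.!?])\s/)[0] ?? c.content
  );
  let content = firstSentences.join(" ");
  // Enforce the hard output budget by truncation.
  if (estimateTokens(content) > outputBudget) {
    content = content.slice(0, outputBudget * 4);
  }
  return {
    content,
    tokenCount: estimateTokens(content),
    derivedFromLeafIds: children.map((c) => c.id),
  };
}
```

The summary leaf carries provenance to every child it condenses, so even the cheap path stays auditable.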
Taste injection at every seam.
Notice what's not in the rhythm: a step where the AI runs unchecked. There isn't one. Every transition between beats is a place where the human can land — review candidates, edit content, kill an iteration. The architecture is built so landing is cheap. You don't pay re-orientation tax to drop in. The frame is already there. The summons are named. The provenance is already assembled.
This is what the orthodoxy gives up when it optimizes for autonomy: the inexpensive moments where the human's signature can mark the work.
v. The exploration tree
Most of what you make will not survive. That's the point.
Loomwork, real loomwork, is mostly about choosing which threads to cut.
Athanor doesn't pretend that ideas arrive perfect. The way work actually gets made — by graphic designers, writers, architects, anyone iterating toward something — is by spawning many candidates and choosing among them. Concept 1, then 1A, 1B, 1C; then 1A-1, 1A-2; then a branch you abandon and a branch you canonize.
The system is built for this. Every frame you compose can be iterated, deriving children that re-pin some leaves you trust and regenerate others. Frames carry a status — draft, explored, canon, abandoned — and the orchestrator preferentially pulls from canon when summoning context. Canon doesn't mean only-one. It means leyline. There can be many.
The leaves themselves outlive any single frame. A leaf you generated in concept-1A-2 can be pinned by a branch in concept-3B if it turns out to matter there. No copy. No data move. The artifact is the leaf; the frames are just the views you happened to be looking through when you minted it.
This is what makes the iteration cheap: truth gets locked in slowly, through use, not asserted at the start. You don't have to know what the right structure is on day one. You let it reveal itself across iterations, and you mark canon when you see it.
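The canon preference described above can be sketched as a ranking rule over frame statuses. Statuses come from the text; the shapes and the function itself are assumptions, not the actual orchestrator.

```typescript
// Sketch: when summoning context, prefer leaves reachable from canon frames,
// then explored, then draft; abandoned frames contribute nothing.
// Deduplication by leaf ID reflects "no copy, no data move".

type FrameStatus = "draft" | "explored" | "canon" | "abandoned";

interface FrameView {
  id: string;
  status: FrameStatus;
  leafIds: string[]; // leaves pinned by this frame's branches
}

const RANK: Record<FrameStatus, number> = {
  canon: 0,
  explored: 1,
  draft: 2,
  abandoned: 3,
};

function summonCandidates(frames: FrameView[], limit: number): string[] {
  const seen = new Set<string>();
  const out: string[] = [];
  const ordered = [...frames].sort((a, b) => RANK[a.status] - RANK[b.status]);
  for (const frame of ordered) {
    if (frame.status === "abandoned") continue;
    for (const id of frame.leafIds) {
      if (out.length >= limit) return out;
      if (!seen.has(id)) {
        seen.add(id);
        out.push(id);
      }
    }
  }
  return out;
}
```

Because the same leaf ID can be pinned by branches in different frames, deduplication is what makes cross-frame pinning free: the artifact exists once, the frames are views.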
vi. The architecture
The body, in five layers.
What's actually under the hood, named honestly:
The reality barrier
Files ingested via path-based buckets with gitignore-style rules. Stringified at the boundary. Token-counted at write. Immutable.
Composed structure
Three kinds: source (auto, flat), analytical (organize what exists), generative (carry a prompt template, spawn new leaves). Frames branch into a DAG via lineage (a parent_frame_id set on each derived frame). Multiple canon frames can co-exist as parallel chosen-truth paths.
Positions in the tree
Adjacency list plus materialized path. Each branch carries a query_spec describing what to summon, an optional output_budget, and pinned leaf IDs.
Curated context, locked in
A named, reusable bundle of leaf IDs the human has reviewed and accepted. The output of the curation loop. The unit of taste at the layer between query and generation. A summon can derive from a prior one (parent_summon_id). Locked summons are forever: they are the saved decisions.
Atoms with provenance
The artifacts. Source-flagged or generated. Generated leaves carry derived_from_leaf_ids back to source. Pinnable by any branch in any frame.
The runtime
Convex provides the data layer, reactive queries, typed mutations, and scheduled actions for fan-out generation. The agent runtime is per-branch — one Convex action per leaf to be generated, in parallel, with the full summon as context. The orchestrator is a single high-capability model (Opus); the per-branch generators are cheaper (Sonnet, sometimes Haiku for low-stakes summarization). A pre-tool-use hook blocks any AI-authored mutation to the schema or data exports — invariant I-1 is enforced structurally, not by policy.
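The per-branch fan-out can be sketched with plain promises. `generateLeaf` here is a stand-in for a Convex action calling a cheaper model; it is not the real runtime, and all names are illustrative.

```typescript
// Fan-out sketch: one generation task per branch, all sharing the locked
// summon as context. generateLeaf() stands in for a per-branch model call.

interface Branch {
  id: string;
  promptTemplate: string;
  outputBudget: number; // hard cap on generated tokens
}

interface GeneratedLeaf {
  branchId: string;
  content: string;
}

async function generateLeaf(
  branch: Branch,
  summonContext: string
): Promise<GeneratedLeaf> {
  // Stand-in: a real implementation would invoke the model here,
  // passing the template and enforcing branch.outputBudget.
  return {
    branchId: branch.id,
    content: `[${branch.promptTemplate}] over ${summonContext.length} chars`,
  };
}

async function fanOut(
  branches: Branch[],
  summonContext: string
): Promise<GeneratedLeaf[]> {
  // All branches run in parallel; each sees the same curated context.
  return Promise.all(branches.map((b) => generateLeaf(b, summonContext)));
}
```

The design point is that parallelism is per branch, not per document: each branch gets the same locked summon, so the human's curation work is paid once and amortized across the whole fan-out.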
The 40% rule
No context summon ever exceeds 40% of the working window. Over-budget summons trigger pyramid summarization: the input is chunked, the same prompt runs against each chunk in parallel, results are condensed in waves until they fit. The threshold is per-Project configurable, validated at runtime against the active model's actual context limit. The apparatus refuses to make itself dumb.
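The 40% rule and the pyramid fallback can be sketched as a budget check plus wave-by-wave condensation. The condense step below is a deterministic stand-in for running the prompt against each chunk in parallel; the fraction comes from the text, everything else is assumed.

```typescript
// Sketch of the 40% rule: a summon must fit within a fraction of the
// model's context window; oversized input is condensed in waves.
// condenseChunk() stands in for one parallel chunk prompt.

const BUDGET_FRACTION = 0.4;

// Crude token estimate: roughly one token per four characters.
const estimateTokens = (s: string): number => Math.ceil(s.length / 4);

function condenseChunk(chunk: string): string {
  // Stand-in: keep the first half of the chunk. A real wave would run the
  // same prompt against every chunk in parallel with a cheap model.
  return chunk.slice(0, Math.ceil(chunk.length / 2));
}

function fitSummon(input: string, contextWindow: number): string {
  const budget = Math.floor(contextWindow * BUDGET_FRACTION);
  let text = input;
  // Pyramid summarization: chunk, condense each chunk, rejoin, repeat
  // until the result fits the budget.
  while (estimateTokens(text) > budget) {
    const chunkChars = budget * 4; // chunk size tied to the budget
    const chunks: string[] = [];
    for (let i = 0; i < text.length; i += chunkChars) {
      chunks.push(text.slice(i, i + chunkChars));
    }
    text = chunks.map(condenseChunk).join("");
  }
  return text;
}
```

Each wave roughly halves the input, so the loop terminates; the real system condenses with model calls rather than truncation, but the shape of the pyramid is the same.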
Pass the blunt. Trade riffs. The patterns from reality are the only law.
vii. Why it pays
Distinction is the only defensible product in a world full of agents.
Every other consultant, agency, and freelancer in your category has access to the same models you do. The price of generic output is collapsing toward zero. What does not collapse: work that has a specific person's signature on it, work that reads like nobody else could have made it.
Athanor is the apparatus that makes signature output cheap to produce at scale. Three claims:
More distinctive output per skilled operator. The structure means the operator's taste is the bottleneck, not their typing speed, not their context-switching, not their fight to remember last week's decisions.
Fewer trips to the slop bin. Outputs come from locked summons of curated source, not from a model's averaged guess at what you might want. Distinction is enforced by the body of the instrument, not hoped for at the prompt.
Reusable taste. Frames, summons, and canon paths from one project become starting material for the next. The apparatus gets sharper across engagements. Your earlier work compounds.
What Athanor is for.
Any work where the deliverable is text, where distinction matters, and where volume is real. Marketing sites with dozens of locally-targeted landing pages. Multi-market SEO content for clients who need each page to feel handmade. Long-form research where source fidelity must be auditable. Agency-scale copy production where the partner's voice has to thread through every artifact. Knowledge management for individuals or small teams whose interpretive lens is the entire product.
What it is not for: tasks where the human's taste isn't the differentiator. If the work is genuinely "produce competent text from a brief, no signature required" — Athanor is overkill. Use a regular LLM with a regular prompt. Save Athanor for work where the moat is you.
The proving ground.
The first corpus running through Athanor is a real, paying engagement: a marketing site for a friend's business, alongside a library of React component patterns. The system ingests both — the brand corpus and the pattern corpus — and produces site sections by scatter-casting patterns across brand context. The deliverable funds the next iteration. The next iteration sharpens the apparatus. By engagement three, the apparatus is its own product.
viii. Closing
The bet, plainly.
Most AI tooling is racing toward agents that need you less. Athanor is built on the opposite bet: that what's actually scarce is human attention applied at the right seams — and that the seams need infrastructure if attention is going to land cheaply.
The apparatus is not a product yet. It's a working substrate, currently being assembled to do real work. The invitation is to help shape it while the structure is still soft, and to see what becomes possible when the bet pays.
If you've ever felt the slop and known there had to be a way through that didn't involve giving up on AI or giving up on having a voice — that's the territory.
There are many strings.
The body is what makes the difference.