DEVELOPER TOOLS × AI
Coordination system for AI-driven software development. TypeScript CLI (26 commands) + Claude skills + slash commands working together via the filesystem. Enables agents to work across sessions with knowledge that improves over time, not just accumulates.
agentfocus.dev
ROLE
Creator
YEAR
2024-2025
STACK
TypeScript CLI · Claude Skills · Bash
STATUS
In Use (Personal)
AI coding agents lose coherence as context windows fill. Building complex software requires coordinating multiple agents across hundreds of tasks over weeks or months. Without a system, knowledge gets lost between sessions and quality degrades. Two principles follow:
Context windows are finite. You need a system that preserves knowledge across sessions without requiring agents to read 1.5M tokens of history.
Knowledge should compress, not accumulate. Each session should make documentation better, not just add to it.
TRADITIONAL (ACCUMULATION)
37 discoveries
Redundant, unorganized, growing
AGENT FOCUS (COMPRESSION)
10 refined discoveries
Organized, essential, improved
Lossless compression: Session 10 doesn't read 1.5M tokens. It gets distilled wisdom from all previous sessions.
Agent Focus has four integrated components working together:
PLANNING
Design & architecture
ORCHESTRATOR
Knowledge management
SCOPES
Session management
MESSAGING
Inter-agent comms
The orchestrator is a long-running agent that never codes. It keeps the specification in sync with reality as things change: it reads worker session logs, graduates discoveries into tribal wisdom and then into canonical docs, and coordinates scope-to-scope transitions. No single worker has enough context to transition between scopes, so the orchestrator maintains the project-level view.
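To make the graduation step concrete, here is a minimal TypeScript sketch of the flow. The types, log format, and function names are hypothetical, and the actual refinement is judgment work done by the orchestrator agent; simple deduplication only stands in for it.

```typescript
// Sketch of discovery graduation (hypothetical names and log format).
// Raw discoveries from worker session logs merge into tribal wisdom;
// compression means reinforcing repeats instead of appending them.

interface Discovery {
  scope: string;    // which scope the worker was in
  insight: string;  // the lesson learned
  sessions: number; // how many sessions have confirmed it
}

// Assume session logs record discoveries as "- [scope] insight" bullets.
function extractDiscoveries(log: string): Discovery[] {
  const out: Discovery[] = [];
  for (const line of log.split("\n")) {
    const m = line.match(/^- \[(.+?)\] (.+)$/);
    if (m) out.push({ scope: m[1], insight: m[2], sessions: 1 });
  }
  return out;
}

// 37 raw discoveries in, 10 refined discoveries out: a duplicate reinforces
// an existing entry rather than growing the document.
function graduate(wisdom: Discovery[], fresh: Discovery[]): Discovery[] {
  const merged = [...wisdom];
  for (const d of fresh) {
    const hit = merged.find((e) => e.insight === d.insight);
    if (hit) hit.sessions += 1;
    else merged.push(d);
  }
  return merged;
}
```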
Built a CLI (af) using the same Markdown Retrieval architecture from Psych Assessment AI. Catalog-numbered specs (1.1.1, 2.3.4) enable programmatic lookups. Handles project init, scope management, task tracking, agent registry, messaging, view generation, and validation. Generates "database views" of massive specifications filtered per scope. Works with Claude skills and slash commands via the filesystem.
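The numbering is what makes programmatic lookup trivial. A sketch of the idea, assuming spec headings carry their catalog number (e.g. `## 2.3.4 Title`); the real `af` commands and spec format may differ.

```typescript
// Sketch: resolve a catalog number like "2.3.4" to its spec section.
// Assumes headings of the form "## 2.3.4 Some Title" (hypothetical format).

// Split a Markdown spec into sections keyed by catalog number.
function indexSpec(markdown: string): Map<string, string> {
  const index = new Map<string, string>();
  let current: string | null = null;
  let buf: string[] = [];
  for (const line of markdown.split("\n")) {
    const m = line.match(/^#{1,6}\s+(\d+(?:\.\d+)*)\s/);
    if (m) {
      if (current) index.set(current, buf.join("\n"));
      current = m[1];
      buf = [line];
    } else {
      buf.push(line);
    }
  }
  if (current) index.set(current, buf.join("\n"));
  return index;
}

// Lookup by exact number or prefix: "2.3" returns 2.3, 2.3.1, 2.3.4, ...
function lookup(index: Map<string, string>, num: string): string[] {
  return [...index.entries()]
    .filter(([k]) => k === num || k.startsWith(num + "."))
    .map(([, body]) => body);
}
```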
Some projects have 200k+ token specifications. You can't code with that in context. The catalog numbering + CLI lets you slice out exactly what's needed for each scope. The spec can be any size (thousands of pages); the system handles it. It could theoretically support hundreds of parallel agents on million-page specs.
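Generating a scope's view is then just a filter over that index: given the catalog numbers a scope owns, emit the slice it can actually hold in context. This sketch reuses `indexSpec` and `lookup` from above; the paths are hypothetical.

```typescript
import { readFileSync, writeFileSync } from "node:fs";

// Sketch: write a per-scope "database view" of the spec (hypothetical paths).
// A scope declares the catalog numbers it owns; everything else is excluded,
// so the view stays small no matter how large the full spec grows.
function generateScopeView(
  specPath: string,
  viewPath: string,
  owned: string[],
): void {
  const index = indexSpec(readFileSync(specPath, "utf8"));
  const slices = owned.flatMap((num) => lookup(index, num));
  writeFileSync(viewPath, slices.join("\n\n"));
}

// A scope that owns sections 2.3 and 4.1 sees only those subtrees.
generateScopeView("spec/spec.md", "scopes/auth/view.md", ["2.3", "4.1"]);
```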
Agents work in the 50k-150k "zone" where they're most productive. At 150k, they hand off: create a structured session log, refine the tribal wisdom document (handoff.md), update code architecture maps. The next session loads ~40k tokens of refined knowledge instead of 150k of raw transcript.
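A sketch of the boundary logic, using the thresholds from the text; the chars/4 token estimate is a crude stand-in for a real tokenizer, and all names are illustrative.

```typescript
// Sketch: decide when a worker should hand off. Thresholds come from the
// text; the chars/4 estimate approximates a tokenizer, nothing more.
const ZONE_START = 50_000;  // below this: still loading context
const HANDOFF_AT = 150_000; // at this point: log, refine handoff.md, stop

function estimateTokens(transcript: string): number {
  return Math.ceil(transcript.length / 4);
}

type Phase = "ramping-up" | "in-the-zone" | "handoff";

function phase(tokens: number): Phase {
  if (tokens < ZONE_START) return "ramping-up";
  if (tokens < HANDOFF_AT) return "in-the-zone";
  return "handoff";
}
```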
Every session is a resumable JSONL file. You can wake agents months later with their full 150k context and ask questions. "Why did you choose approach X?" The agent responds based on actual experience, not documentation guesses.
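Resumability falls out of the storage format: a session is just lines of JSON. A sketch of loading one back, with hypothetical field names and path (the real JSONL schema belongs to the agent runtime).

```typescript
import { readFileSync } from "node:fs";

// Sketch: load a saved session from JSONL, one JSON object per line.
// Field names are hypothetical; the real schema belongs to the runtime.
interface Turn {
  role: string;
  content: string;
}

function loadSession(path: string): Turn[] {
  return readFileSync(path, "utf8")
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as Turn);
}

// Months later, the full transcript is still there to wake the agent with.
const turns = loadSession("sessions/2024-11-03-auth.jsonl");
console.log(`${turns.length} turns loaded, ready to resume`);
```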
No databases, no APIs, no infrastructure. Just Markdown, YAML, and JSON files. Git-versioned, human-readable, grep-able. Copy a folder, copy the entire project state.
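A hypothetical layout of that state (illustrative paths, not the tool's actual structure), showing why copying one folder copies the whole project:

```
project/
├── spec/          # catalog-numbered specification (Markdown)
├── scopes/        # per-scope views and task tracking (Markdown/YAML)
├── sessions/      # resumable JSONL transcripts
├── handoff.md     # tribal wisdom, refined every session
└── agents.yaml    # agent registry
```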
THE 150K TOKEN BOUNDARY
RAMPING UP (0-50K)
Loading context, exploring
IN THE ZONE (50K-150K)
Peak productivity
DEGRADATION (150K+)
Time to handoff
Handing off at 150k leaves a 50k buffer for future wakeups. Tested empirically across dozens of sessions.
100+
SESSIONS
Across multiple projects
200k+
SPEC TOKENS
Handles any size
∞
RESUMABLE
Wake agents months later
Dogfooded extensively. Psych Assessment AI was built across dozens of sessions. Food Science AI's self-improving eval loop ran for days using Agent Focus handoffs.
Peak example: Isotype (personal SaaS project) was built entirely with Agent Focus from greenfield to production. 200k+ token specification, hundreds of sessions across months. The system built itself.
Building complex AI applications requires more than one coding session. Context windows fill up. Knowledge gets lost. Quality degrades.
Agent Focus solves the coordination problem: how to build software with AI agents across weeks or months without losing coherence. Knowledge compression, strategic handoffs, and queryable past sessions create a system where quality compounds instead of degrading.
This is meta-infrastructure: a system for building AI systems. And it's been dogfooded extensively, including building itself.