The Context Window Problem
AI coding assistants like Claude Code are powerful but expensive. Every file you open, every reference you make, consumes tokens from your context window and your budget. For large codebases, these costs add up fast, making AI-assisted development prohibitively expensive for many teams.
A 98% Reduction
Developer Michał Kowieski has released an MCP (Model Context Protocol) server that addresses this head-on. By intelligently managing how much context gets sent to Claude, the server reduces token consumption by 98% without meaningfully degrading code quality.
The approach centers on a selective context mode: loading only the most relevant code sections rather than entire files or directories. Instead of sending thousands of lines of peripheral code, the server sends just what Claude needs to complete the specific task at hand.
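The article doesn't publish the server's selection logic, but the core idea of relevance-based loading can be sketched with a toy heuristic. The function below (entirely hypothetical, not the actual implementation) ranks code sections by keyword overlap with the task description and fills a fixed character budget greedily:

```python
import re

def select_relevant_sections(task: str, sections: dict[str, str],
                             budget_chars: int = 2000) -> dict[str, str]:
    """Rank code sections by keyword overlap with the task description,
    then include them greedily until the character budget is spent.
    A toy stand-in for the server's (unpublished) relevance heuristics."""
    keywords = set(re.findall(r"\w+", task.lower()))

    def score(text: str) -> int:
        # Count how many task keywords appear in this section.
        return len(keywords & set(re.findall(r"\w+", text.lower())))

    ranked = sorted(sections.items(), key=lambda kv: score(kv[1]), reverse=True)
    chosen, used = {}, 0
    for name, text in ranked:
        if used + len(text) > budget_chars:
            continue  # skip sections that would blow the budget
        chosen[name] = text
        used += len(text)
    return chosen
```

A real implementation would use dependency analysis rather than raw keyword matching, but the shape is the same: score, rank, and cut at a budget.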
Why This Matters
At current API pricing, a 98% reduction changes the economics of AI coding. What might have cost $50 per day of heavy use now costs $1. This puts AI-assisted development within reach of individual developers, bootstrapped startups, and teams with limited budgets.
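The back-of-envelope math behind that claim is simple. Using illustrative numbers (the rate and daily volume below are assumptions, not actual Anthropic pricing):

```python
# Illustrative cost comparison -- rates and volumes are assumed, not quoted.
input_price_per_mtok = 3.00        # assumed $ per million input tokens
tokens_per_day_full = 16_000_000   # hypothetical heavy-use day, whole files sent

daily_cost_full = tokens_per_day_full / 1_000_000 * input_price_per_mtok
daily_cost_reduced = daily_cost_full * 0.02  # 98% fewer tokens sent

print(f"full: ${daily_cost_full:.2f}, reduced: ${daily_cost_reduced:.2f}")
```

Whatever the exact rates, a 50x reduction in tokens sent translates directly into a 50x reduction in input-token spend.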
More importantly, it enables new workflows. Developers can afford to keep Claude active longer, iterate more freely, and explore larger portions of their codebase without worrying about racking up charges.
How It Works
The MCP server sits between your IDE and Claude Code, intercepting requests and intelligently filtering what gets sent upstream. It uses heuristics about code relevance, dependency graphs, and task context to make smart decisions about what to include and what to omit.
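One of the signals mentioned above, the dependency graph, lends itself to a simple illustration. The sketch below (a hypothetical heuristic, not the server's actual code) keeps only files reachable from the task's entry file within a fixed number of import hops:

```python
from collections import deque

def files_within_hops(dep_graph: dict[str, list[str]],
                      entry: str, max_hops: int = 1) -> set[str]:
    """Breadth-first search over an import graph: keep only files
    reachable from the task's entry file within max_hops edges.
    A toy version of dependency-based context filtering."""
    seen = {entry}
    frontier = deque([(entry, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue  # don't expand past the hop limit
        for dep in dep_graph.get(node, []):
            if dep not in seen:
                seen.add(dep)
                frontier.append((dep, depth + 1))
    return seen
```

Raising `max_hops` trades cost for completeness: more of the codebase reaches the model, at more tokens per request.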
The server is open source and can be self-hosted, giving teams full control over their context management strategies.
Takeaway
Context optimization is becoming a crucial layer in the AI coding stack. As models get more capable, the bottleneck increasingly becomes what we can afford to show them. Solutions that extend our context budgets—like this 98% reduction—multiply the value we can extract from AI assistants.
Image credit: Mksg