Lean CTX

About

A local-first context runtime that compresses file reads and shell output before they reach the LLM, reducing token waste by 60-95% (up to 99% on cached reads). Provides 58 MCP tools for AI coding agents.
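To make the headline figures concrete, here is what the claimed reduction range means for a single read (the 10,000-token file size below is hypothetical, chosen only for illustration):

```python
def tokens_reaching_llm(original_tokens: int, reduction: float) -> int:
    """Tokens left in context after compressing at the given ratio."""
    return original_tokens - round(original_tokens * reduction)

# A hypothetical 10,000-token file read at the claimed bounds:
low_savings = tokens_reaching_llm(10_000, 0.60)   # 60% reduction
high_savings = tokens_reaching_llm(10_000, 0.95)  # 95% reduction
```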

Features

  • File reads (MCP): Cached + mode-aware reads (full, map, signatures, diff, etc.) with graph-aware related-file hints
  • Shell output compression: Compresses noisy CLI output via 95+ patterns (git, npm, cargo, docker, etc.)
  • Graph-Powered Intelligence: Multi-edge Property Graph (imports, calls, exports, type_ref) with weighted impact analysis, hybrid search (BM25 + embeddings + graph proximity via RRF), and incremental git-diff updates
  • PR Context Packs: lean-ctx pack --pr builds a PR-ready context pack (changed files, related tests, impact, artifacts)
  • Context Packages: Bundles Knowledge + Graph + Session + Gotchas into portable .lctxpkg files
  • Session memory (CCP): Persist task/facts/decisions across chats with structured recovery queries
  • HTTP mode: lean-ctx serve for Streamable HTTP MCP + /v1/tools/call
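The hybrid search above fuses BM25, embedding, and graph-proximity rankings with Reciprocal Rank Fusion (RRF). A minimal sketch of RRF itself, independent of lean-ctx internals (the file lists are hypothetical inputs):

```python
from collections import defaultdict

def rrf(rankings: list[list[str]], k: int = 60) -> list[str]:
    """Fuse ranked lists: each item accumulates 1 / (k + rank) per list."""
    scores: dict[str, float] = defaultdict(float)
    for ranking in rankings:
        for rank, doc in enumerate(ranking, start=1):
            scores[doc] += 1.0 / (k + rank)
    # Higher fused score first.
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical result lists from the three retrievers:
bm25 = ["parser.rs", "lexer.rs", "main.rs"]
embed = ["lexer.rs", "parser.rs", "ast.rs"]
graph = ["parser.rs", "ast.rs", "lexer.rs"]
fused = rrf([bm25, embed, graph])
```

`parser.rs` wins because it ranks highly in all three lists, which is exactly the behavior RRF is chosen for: agreement across retrievers beats a single first-place finish.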

Installation Methods

# Universal install (no Rust needed)
curl -fsSL https://leanctx.com/install.sh | sh

# macOS / Linux
brew tap yvgude/lean-ctx && brew install lean-ctx

# Node.js
npm install -g lean-ctx-bin

# Rust
cargo install lean-ctx

# Pi Coding Agent
pi install npm:pi-lean-ctx

Setup

After installation, run:

lean-ctx setup
lean-ctx init --agent <agent-name>

Then restart your shell and editor/AI tool.
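For agents configured through a JSON MCP settings file, the entry that `init` writes typically resembles the following. This is only a sketch: the exact schema varies by client, and the server key and command line shown here are assumptions, not the documented output.

```json
{
  "mcpServers": {
    "lean-ctx": {
      "command": "lean-ctx",
      "args": ["mcp"]
    }
  }
}
```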

Read Modes

The server provides 10 read modes:

  • full: Complete file content
  • map: Structural overview
  • signatures: Function/method signatures only
  • diff: Changes only
  • And more...
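The idea behind mode-aware reads can be illustrated with a toy model (this is not lean-ctx's implementation): one read entry point returns a different, smaller view of the same file depending on the requested mode.

```python
def read_file(text: str, mode: str = "full") -> str:
    """Toy mode-aware read: return a view of the file text by mode."""
    lines = text.splitlines()
    if mode == "full":
        return text
    if mode == "signatures":
        # Keep only lines that look like function definitions.
        return "\n".join(l for l in lines if l.lstrip().startswith("def "))
    if mode == "map":
        # Structural overview: line count plus top-level names.
        names = [l.split("(")[0].removeprefix("def ")
                 for l in lines if l.startswith("def ")]
        return f"{len(lines)} lines; defs: {', '.join(names)}"
    raise ValueError(f"unknown mode: {mode}")

source = "def add(a, b):\n    return a + b\n\ndef sub(a, b):\n    return a - b"
```

Even in this toy, `signatures` and `map` shrink the payload sharply relative to `full`, which is the point of choosing a mode per read.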

Integration Modes

  • CLI-Redirect: Agent calls lean-ctx directly via shell (zero MCP schema overhead)
  • Hybrid: MCP for cached reads (a cached read costs ~13 tokens), CLI for shell + search
  • Full MCP: All 58 tools via MCP protocol
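In HTTP mode (`lean-ctx serve`), tools are invoked by POSTing JSON to `/v1/tools/call`. A sketch of assembling such a request; the payload field names (`name`, `arguments`) and the tool and arguments shown are assumptions, not the documented schema:

```python
import json
from urllib.request import Request

def build_tool_call(base_url: str, tool: str, arguments: dict) -> Request:
    """Build (but do not send) a POST request to the tool-call endpoint."""
    body = json.dumps({"name": tool, "arguments": arguments}).encode()
    return Request(
        url=f"{base_url}/v1/tools/call",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Hypothetical call: a cached read of one file in "signatures" mode.
req = build_tool_call("http://127.0.0.1:8080", "read_file",
                      {"path": "src/main.rs", "mode": "signatures"})
```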

Supported Agents

Works with any MCP-compatible client including Cursor, Codex, Gemini, Claude Code, Windsurf, GitHub Copilot, Cline, Roo Code, VS Code, Zed, Neovim, JetBrains IDEs, and many more.

Privacy

  • No telemetry by default
  • Optional anonymous stats (opt-in during setup)
  • Runs locally; your code never leaves your machine

Benchmarks

See BENCHMARKS.md or run:

lean-ctx benchmark report .



Categories

AI Tools, Development, Productivity