LeanCTX cuts language-model token consumption by up to 99% through a single Rust binary that provides: a shell hook with 95+ CLI compression patterns, an MCP server with 58 tools for cached reads and session intelligence, and a persistent project graph for understanding code structure. Install it once, and every file read, shell command, and code navigation becomes more efficient.
Command generator
Pick your platform and AI tool below. The generated command can be pasted into any AI coding assistant.
Get started in 3 steps
```shell
# 1. Install (pick one)
curl -fsSL https://leanctx.com/install.sh | sh      # universal
brew tap yvgude/lean-ctx && brew install lean-ctx   # macOS / Linux
npm install -g lean-ctx-bin                         # Node.js
cargo install lean-ctx                              # Rust
pi install npm:pi-lean-ctx                          # Pi Coding Agent

# 2. Setup (auto-configures shell + ALL detected editors)
lean-ctx setup

# 3. Restart your shell
source ~/.zshrc   # or ~/.bashrc
```
lean-ctx setup automatically detects and configures: Cursor, Claude Code, Windsurf, VS Code / Copilot, Codex CLI, Gemini CLI, Zed, Antigravity, OpenCode, Crush, and Pi. Run lean-ctx doctor to verify everything.
Detailed installation instructions by platform
Step 1: Install the binary
Universal installer (any platform)
The fastest way to install lean-ctx with no prerequisites:
```shell
curl -fsSL https://leanctx.com/install.sh | sh
```
The install script detects your OS and architecture, downloads the correct binary, and places it in /usr/local/bin. It works on macOS, Linux, and WSL.
If you prefer to review the script before running it:
```shell
curl -fsSL https://leanctx.com/install.sh -o install.sh
chmod +x install.sh
./install.sh --download   # pre-built binary (no Rust)
./install.sh              # build from source (requires Rust)
```
npm (Node.js)
Install the pre-built binary via npm; no Rust required:
```shell
npm install -g lean-ctx-bin
```
This downloads the correct binary for your platform during postinstall, with SHA256 verification.
macOS
Option A: Homebrew (recommended)
```shell
brew tap yvgude/lean-ctx
brew install lean-ctx
```
Homebrew handles updates and PATH automatically.
Option B: Cargo (build from source)
```shell
cargo install lean-ctx
```
If lean-ctx isn't found after install, add Cargo's bin directory to your PATH:
```shell
echo 'export PATH="$HOME/.cargo/bin:$PATH"' >> ~/.zshrc
source ~/.zshrc
```
Option C: Manual binary download
- Download the binary for your architecture from GitHub Releases:
  - lean-ctx-aarch64-apple-darwin.tar.gz (Apple Silicon / M1+)
  - lean-ctx-x86_64-apple-darwin.tar.gz (Intel)
- Extract and move to PATH:
```shell
tar xzf lean-ctx-aarch64-apple-darwin.tar.gz
chmod +x lean-ctx
sudo mv lean-ctx /usr/local/bin/
```
If macOS blocks the binary with a Gatekeeper warning, remove the quarantine attribute:
```shell
xattr -d com.apple.quarantine /usr/local/bin/lean-ctx
```
Linux
Option A: Homebrew
```shell
brew tap yvgude/lean-ctx && brew install lean-ctx
```
Option B: AUR (Arch Linux)
```shell
# Source build from crates.io
yay -S lean-ctx

# Or pre-built binary from GitHub
yay -S lean-ctx-bin
```
Works with any AUR helper: yay, paru, trizen, etc.
Option C: Cargo (build from source)
```shell
cargo install lean-ctx
```
Requires the Rust toolchain. Install via rustup.rs if needed.
Option D: Manual binary download
- Download lean-ctx-x86_64-unknown-linux-gnu.tar.gz from GitHub Releases
- Extract and move to PATH:
```shell
tar xzf lean-ctx-x86_64-unknown-linux-gnu.tar.gz
chmod +x lean-ctx
sudo mv lean-ctx /usr/local/bin/
```
Statically linked musl archives are also available:
| Archive | Target |
|---|---|
| lean-ctx-x86_64-unknown-linux-musl.tar.gz | x86_64 - statically linked (musl) |
| lean-ctx-aarch64-unknown-linux-musl.tar.gz | ARM64 / aarch64 (Graviton, Raspberry Pi 4+, etc.) - statically linked |
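Which archive matches your machine can be read off `uname -m`; a small sketch (asset names taken from the table above) that maps the reported architecture to the right download:

```shell
# Map the machine architecture reported by uname to the matching
# release archive (asset names as listed in the table above).
arch="$(uname -m)"
case "$arch" in
  x86_64)        asset="lean-ctx-x86_64-unknown-linux-musl.tar.gz" ;;
  aarch64|arm64) asset="lean-ctx-aarch64-unknown-linux-musl.tar.gz" ;;
  *) echo "no pre-built musl binary for $arch" >&2; exit 1 ;;
esac
echo "download: $asset"
```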
Windows
Option A: Binary download (recommended)
- Download lean-ctx-x86_64-pc-windows-msvc.zip from GitHub Releases
- Extract the .zip file
- Open the extracted folder - you'll see lean-ctx.exe
- Move lean-ctx.exe to a folder in your PATH, for example:
```shell
mkdir %USERPROFILE%\bin
move lean-ctx.exe %USERPROFILE%\bin\
```
- Add the folder to your PATH if it isn't already:
  - Open Settings → System → About → Advanced system settings
  - Click Environment Variables
  - Under User variables, select Path, click Edit, then New
  - Add the path to your bin folder (e.g. C:\Users\you\bin)
- Open a new PowerShell or CMD window and verify:
```shell
lean-ctx --version
```
Option B: Cargo (build from source)
- Install Rust if not already installed
- Build and install:
```shell
cargo install lean-ctx
```
- The binary is placed in %USERPROFILE%\.cargo\bin\, which is usually already in your PATH.
Option C: Build from source (clone & compile)
```shell
git clone https://github.com/yvgude/lean-ctx.git
cd lean-ctx\rust
cargo build --release
copy target\release\lean-ctx.exe %USERPROFILE%\bin\
```
WSL users: If you're running WSL, follow the Linux instructions instead.
Build from source (any platform)
```shell
git clone https://github.com/yvgude/lean-ctx.git
cd lean-ctx/rust
cargo build --release
```
The binary is at target/release/lean-ctx (or lean-ctx.exe on Windows). Copy it to a directory on your PATH.
Step 2: Setup
After installing, run lean-ctx setup to configure everything automatically:
```shell
lean-ctx setup
```
This single command:
- Installs 23 shell shortcuts (git, npm, cargo, docker, kubectl, curl, and more)
- Auto-detects installed editors (Cursor, Claude Code, Windsurf, VS Code/Copilot, Codex CLI, Gemini CLI, Zed, Antigravity, OpenCode, Crush)
- Creates MCP config files for every detected editor
- Injects agent rules into the global config of each detected tool
- Runs lean-ctx doctor to verify
Important: restart your IDE after setup!
The MCP connection must be re-established for the new tools and rules to take effect. Quit the IDE completely (Cmd+Q / Ctrl+Q), then reopen it.
After setup, restart your shell:
| Shell | Restart command |
|---|---|
| Zsh (macOS default) | source ~/.zshrc |
| Bash | source ~/.bashrc |
| Fish | source ~/.config/fish/config.fish |
| PowerShell | Close and reopen PowerShell |
Then fully restart your editor.
Manual alternative
If you prefer to install only the shell shortcuts, without automatic editor detection:
```shell
lean-ctx init --global
```
Then configure your editor manually using the instructions in Step 3.
Eval-based configuration (recommended)
As an alternative to lean-ctx init --global, you can use the eval pattern - the same approach used by starship, zoxide, and atuin. The hook code is printed to stdout by the currently installed binary, so it is always up to date after upgrades:
```shell
# bash: add to ~/.bashrc
eval "$(lean-ctx init bash)"

# zsh: add to ~/.zshrc
eval "$(lean-ctx init zsh)"

# fish: add to ~/.config/fish/config.fish
lean-ctx init fish | source

# powershell: add to $PROFILE
lean-ctx init powershell | Invoke-Expression
```
The advantage: the eval approach regenerates the hooks from the installed binary at shell startup, so they can never go stale after lean-ctx update. No need to re-run lean-ctx setup after upgrading.
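The mechanics of the eval pattern can be shown with a stand-in generator (the `demo_init` function below is hypothetical, not part of lean-ctx): a program prints shell code to stdout, and eval executes that code in the current shell, so whatever functions or aliases it defines land in your live session.

```shell
# demo_init stands in for a command like `lean-ctx init bash`:
# it prints shell code to stdout.
demo_init() {
  printf 'demo_hook() { echo "hook active"; }\n'
}

# eval runs the printed code in the *current* shell, so demo_hook
# is now defined here - the same way starship/zoxide hooks load.
eval "$(demo_init)"
demo_hook   # → hook active
```

Because the code is generated at startup, upgrading the generator binary automatically upgrades the hook.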
Docker installation
To use lean-ctx inside a Docker container (e.g. with Claude Code, Codex, or other AI agents), use the statically linked musl binary and set BASH_ENV so the shell hook loads in non-interactive shells.
```dockerfile
ARG LEAN_CTX_VERSION=3.5.1

# Download the musl binary (statically linked, no glibc dependency)
RUN curl -fsSL \
    "https://github.com/yvgude/lean-ctx/releases/download/v${LEAN_CTX_VERSION}/lean-ctx-x86_64-unknown-linux-musl.tar.gz" \
    | tar -xz -C /usr/local/bin lean-ctx && \
    chmod +x /usr/local/bin/lean-ctx

# Install shell hook (non-interactive) + configure your AI tool
RUN lean-ctx init --global && \
    lean-ctx init --agent claude

# Load shell hook in non-interactive shells
ENV BASH_ENV="/root/.lean-ctx/env.sh"

# Claude Code: sources this file before each command
ENV CLAUDE_ENV_FILE="/root/.lean-ctx/env.sh"
```
For ARM64 hosts (AWS Graviton, Apple Silicon via Docker), use the aarch64 binary instead:
```dockerfile
# For ARM64 (Apple Silicon / Graviton), use the aarch64 binary:
RUN curl -fsSL \
    "https://github.com/yvgude/lean-ctx/releases/download/v${LEAN_CTX_VERSION}/lean-ctx-aarch64-unknown-linux-musl.tar.gz" \
    | tar -xz -C /usr/local/bin lean-ctx
```
Why this works
| Step | Purpose |
|---|---|
| musl binary | Statically linked - runs on any Linux distro without installing gcc-libs or glibc |
| lean-ctx init --global | Installs the shell hook in ~/.bashrc (non-interactive, unlike lean-ctx setup) |
| lean-ctx init --agent claude | Writes the MCP server config to ~/.claude.json. Replace claude with your agent: cursor, codex, gemini, etc. |
| BASH_ENV | Plain non-interactive shells (bash -c) load this file. Most ~/.bashrc files exit early in non-interactive mode - env.sh has no such guard. |
| CLAUDE_ENV_FILE | Claude Code sources this file before every command it runs. This is the officially recommended way to configure the shell environment for Claude Code. |
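The ~/.bashrc guard behavior described in the table can be reproduced with plain bash, no lean-ctx required (the /tmp file names here are illustrative):

```shell
# Typical guard at the top of ~/.bashrc: stop when non-interactive.
cat > /tmp/demo_bashrc <<'EOF'
case $- in
  *i*) ;;      # interactive shell: keep going
  *) return ;; # non-interactive shell: stop sourcing here
esac
export FROM_BASHRC=1
EOF

# A BASH_ENV-style env file has no guard, so it always takes effect.
cat > /tmp/demo_env.sh <<'EOF'
export FROM_ENV=1
EOF

# In a non-interactive shell the bashrc guard fires; the env file loads.
bash -c '. /tmp/demo_bashrc; . /tmp/demo_env.sh; echo "bashrc=${FROM_BASHRC:-unset} env=${FROM_ENV:-unset}"'
# → bashrc=unset env=1
```

This is why a guard-free env.sh loaded via BASH_ENV works inside containers where every command runs through a non-interactive `bash -c`.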
Common issues
| Error | Cause | Fix |
|---|---|---|
| command not found: _lc | BASH_ENV not set - shell hook not loaded | Add ENV BASH_ENV="/root/.lean-ctx/env.sh" (adjust the path for non-root users) |
| lean-ctx setup hangs | Interactive prompt with no stdin in Docker | Use lean-ctx init --global instead |
| Binary not found / exec format error | Wrong architecture, or a gnu binary without glibc | Use the musl archive for slim containers |
Step 3: Verify
- Check the binary is installed:
```shell
lean-ctx --version   # → lean-ctx 3.5.1
lean-ctx doctor      # checks PATH, config, aliases, MCP
```
- Test the shell hook:
```shell
git status      # output is now compressed
lean-ctx gain   # shows token savings so far
```
- Open your AI coding tool and start coding - lean-ctx tools should be available automatically.
Manual editor configuration
Only needed if lean-ctx setup didn't work.
Connect lean-ctx to your AI coding tool by adding it as an MCP server.
Before you start: Find your lean-ctx binary path. You'll need it for editors that require a full path.
```shell
# macOS / Linux:
which lean-ctx
# Example output: /opt/homebrew/bin/lean-ctx or /Users/you/.cargo/bin/lean-ctx

# Windows (PowerShell):
where.exe lean-ctx
# Example output: C:\Users\you\.cargo\bin\lean-ctx.exe
```
Keep this path handy - some editors need the full absolute path instead of just lean-ctx.
Cursor
- Create or edit the MCP config file:
```shell
# macOS / Linux:
mkdir -p ~/.cursor
nano ~/.cursor/mcp.json

# Windows (PowerShell):
mkdir -Force "$env:USERPROFILE\.cursor"
notepad "$env:USERPROFILE\.cursor\mcp.json"
```
- Paste this JSON:
```json
{ "mcpServers": { "lean-ctx": { "command": "lean-ctx" } } }
```
- Restart Cursor completely (Cmd+Q / Alt+F4, then reopen).
- Verify: open the MCP panel in settings - lean-ctx should show as "connected".
Optional: install agent rules to make Cursor prefer lean-ctx tools:
```shell
lean-ctx init --agent cursor
```
Claude Code
Claude Code has built-in MCP support. No config file needed:
```shell
# Add lean-ctx as MCP server:
claude mcp add lean-ctx lean-ctx

# Install the CLAUDE.md instructions (makes Claude use lean-ctx tools):
lean-ctx init --agent claude
```
The init command creates a CLAUDE.md file that teaches Claude to use lean-ctx tools instead of native file reads and shell commands.
Claude Code: 2048-Character Limit
Claude Code truncates MCP server instructions to 2048 characters. This means lean-ctx's full instruction set (session management, compression protocols, tool preferences) gets cut off.
lean-ctx handles this automatically: lean-ctx init --agent claude installs the full instructions as a Claude Rules file (~/.claude/rules/lean-ctx.md) and an Agent Skill (~/.claude/skills/lean-ctx/). Claude Code loads these without any character cap.
```shell
# Automatic (recommended) - run during init:
lean-ctx init --agent claude

# This automatically:
# 1. Registers MCP server in ~/.claude.json
# 2. Installs full instructions to ~/.claude/rules/lean-ctx.md
# 3. Installs Agent Skill to ~/.claude/skills/lean-ctx/
```
If you installed lean-ctx manually (without init), run lean-ctx init --agent claude to retroactively install the rules and skills.
GitHub Copilot (VS Code)
Option A: Copilot CLI (.github/mcp.json)
- Create .github/mcp.json in your project root:
```json
{ "mcpServers": { "lean-ctx": { "command": "lean-ctx" } } }
```
- Restart VS Code / Copilot CLI.
Option B: VS Code Copilot (.vscode/mcp.json)
- Create .vscode/mcp.json in your project root:
```json
{ "servers": { "lean-ctx": { "type": "stdio", "command": "lean-ctx" } } }
```
- Restart VS Code.
Tip: lean-ctx init --agent copilot creates both files automatically.
Note: GitHub Copilot MCP support requires VS Code 1.99+ and the latest Copilot extension.
Per-project: This config is per-project. Add .github/mcp.json to your .gitignore or commit it to share with your team.
Windsurf
- Open the MCP config file:
```shell
# macOS / Linux:
mkdir -p ~/.codeium/windsurf
nano ~/.codeium/windsurf/mcp_config.json

# Windows (PowerShell):
mkdir -Force "$env:USERPROFILE\.codeium\windsurf"
notepad "$env:USERPROFILE\.codeium\windsurf\mcp_config.json"
```
- Paste this JSON:
```json
{ "mcpServers": { "lean-ctx": { "command": "/FULL/PATH/TO/lean-ctx" } } }
```
Important: Windsurf requires the full absolute path to the binary. Example for Homebrew on macOS:
```json
{ "mcpServers": { "lean-ctx": { "command": "/opt/homebrew/bin/lean-ctx" } } }
```
- Restart Windsurf completely.
- Check the MCP connection in Windsurf settings.
Zed
- Open Zed settings:
```shell
# macOS / Linux: ~/.config/zed/settings.json
# Windows:       %APPDATA%\Zed\settings.json
```
- Add the lean-ctx context server configuration:
```json
{ "context_servers": { "lean-ctx": { "source": "custom", "command": "lean-ctx", "args": [], "env": {} } } }
```
Note: Zed uses context_servers (not mcpServers) and requires source: "custom".
- Save and restart Zed.
- Verify: lean-ctx tools should appear in the assistant panel.
Tip: Run lean-ctx init --agent zed to install agent rules that make Zed prefer lean-ctx tools.
OpenAI Codex CLI
- Create or edit the Codex config file:
```shell
# macOS / Linux:
mkdir -p ~/.codex
nano ~/.codex/config.toml

# Windows (PowerShell):
mkdir -Force "$env:USERPROFILE\.codex"
notepad "$env:USERPROFILE\.codex\config.toml"
```
- Add the MCP server entry:
```toml
[mcp_servers.lean-ctx]
command = "lean-ctx"
args = []
```
- Restart Codex CLI.
Alternative: use the Codex CLI built-in command:
```shell
codex mcp add lean-ctx
```
Install agent instructions for Codex:
```shell
lean-ctx init --agent codex
```
Gemini CLI
- Create or edit the Gemini MCP config:
```shell
# macOS / Linux:
mkdir -p ~/.gemini
nano ~/.gemini/settings.json

# Windows (PowerShell):
mkdir -Force "$env:USERPROFILE\.gemini"
notepad "$env:USERPROFILE\.gemini\settings.json"
```
- Add the MCP server configuration:
```json
{ "mcpServers": { "lean-ctx": { "command": "lean-ctx" } } }
```
Important: If Gemini can't find the binary, use the full path. Example for Homebrew on macOS:
```json
{ "mcpServers": { "lean-ctx": { "command": "/opt/homebrew/bin/lean-ctx" } } }
```
- Restart Gemini CLI.
- Verify with a prompt that triggers tool use.
Known issue: Gemini CLI may not always invoke MCP tools consistently. If tools aren't triggered, run lean-ctx init --agent gemini to install a GEMINI.md with usage instructions.
Antigravity
Antigravity uses the same MCP config format as Gemini CLI but in a separate directory.
- Create the Antigravity MCP config directory and file:
```shell
# macOS / Linux:
mkdir -p ~/.gemini/antigravity
nano ~/.gemini/antigravity/mcp_config.json

# Windows (PowerShell):
mkdir -Force "$env:USERPROFILE\.gemini\antigravity"
notepad "$env:USERPROFILE\.gemini\antigravity\mcp_config.json"
```
- Paste this JSON:
```json
{ "mcpServers": { "lean-ctx": { "command": "lean-ctx" } } }
```
Important: If Antigravity can't find the binary, use the full absolute path to lean-ctx.
- Restart Antigravity.
- Verify with a prompt that triggers tool use.
You can also manage MCP servers via Antigravity's built-in server management UI.
OpenCode
- Create or edit the OpenCode config file:
```shell
# macOS / Linux:
mkdir -p ~/.config/opencode
nano ~/.config/opencode/opencode.json
```
- Add the MCP server configuration:
```json
{ "$schema": "https://opencode.ai/config.json", "mcp": { "lean-ctx": { "type": "local", "command": ["lean-ctx"], "enabled": true } } }
```
- Restart OpenCode.
OpenClaw
OpenClaw supports MCP servers natively. You can add lean-ctx from the settings UI or via CLI.
- Open OpenClaw settings and navigate to the MCP servers section.
- Add a new MCP server with command: lean-ctx
- Restart OpenClaw to activate.
Tip: Run lean-ctx init --agent openclaw to install lean-ctx skills for OpenClaw.
Pi
Pi has a dedicated lean-ctx plugin (pi-lean-ctx) that integrates automatically.
- Make sure lean-ctx is installed:
```shell
cargo install lean-ctx
```
- Install the Pi plugin:
```shell
pi install npm:pi-lean-ctx
```
Or use lean-ctx's built-in command:
```shell
lean-ctx init --agent pi
```
- Restart Pi.
- Verify: lean-ctx tools should appear in Pi's tool list.
Note: Pi's smart reads automatically use lean-ctx when the plugin is installed. No additional configuration needed.
AWS Kiro
AWS Kiro requires both an MCP configuration and a steering file to use lean-ctx tools instead of the native alternatives.
- Run lean-ctx setup - Kiro is detected automatically and both the MCP config and the steering file are created.
- Or use the dedicated command (creates both files):
```shell
lean-ctx init --agent kiro
```
- This creates two files:
  - ~/.kiro/settings/mcp.json - the MCP server connection
  - .kiro/steering/lean-ctx.md - a steering file telling Kiro to prefer mcp_lean_ctx_ctx_read over readFile, etc.
- Restart Kiro to activate.
Important: the steering file (.kiro/steering/lean-ctx.md) is per-project. Without it, Kiro defaults to its native tools and ignores the MCP server for reads and search.
Verdent
Verdent supports MCP servers via configuration or CLI.
- Open Verdent settings or navigate to the MCP section.
- Run the init command:
```shell
lean-ctx init --agent verdent
```
- Or manually create the MCP config:
```json
{ "mcpServers": { "lean-ctx": { "command": "lean-ctx" } } }
```
- Restart Verdent to activate.
Note: Verdent's MCP support may require a recent version. Update Verdent if tools don't appear.
Other MCP-compatible tools
Any tool that supports the Model Context Protocol can use lean-ctx. The standard MCP config format is:
```json
{
  "mcpServers": {
    "lean-ctx": {
      "command": "lean-ctx"
    }
  }
}
```
Key points:
- The command must point to the lean-ctx binary (use the full path if it isn't on PATH).
- No args or env entries are required - lean-ctx auto-configures.
- The binary communicates via stdio (the standard MCP transport).
- After adding the config, restart your tool and verify that lean-ctx tools appear.
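Because the transport is plain JSON-RPC over stdio, you can smoke-test any such server from the shell by handing it an initialize request. A hypothetical sketch: the protocolVersion and clientInfo values below are illustrative, and the exact reply shape depends on the server.

```shell
# The first message an MCP client sends: a JSON-RPC 2.0 initialize
# request. (protocolVersion and clientInfo values are illustrative.)
req='{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-06-18","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.0"}}}'

# Pipe it to the server binary; a healthy stdio MCP server answers
# with a JSON-RPC "result" object on stdout:
#   printf '%s\n' "$req" | lean-ctx
printf '%s\n' "$req"
```

If the binary exits immediately or prints nothing, check the path in your config and the tool's MCP logs.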
Monitoring your savings
```shell
# Terminal dashboard (colors, bars, sparklines)
lean-ctx gain

# Web dashboard with charts
lean-ctx dashboard

# Find uncompressed commands in shell history
lean-ctx discover

# Run a real benchmark on your project
lean-ctx benchmark run
```
Agent hooks and editor integration
Configure lean-ctx for your AI agent or code editor. The init command sets up the MCP configuration and shell hooks, and verifies that everything works correctly. Use doctor to diagnose problems and fix them automatically.
```shell
# Configure a specific AI tool (mode auto-detected; override with --mode)
lean-ctx init --agent cursor
lean-ctx init --agent claude
lean-ctx init --agent codex
lean-ctx init --agent gemini
lean-ctx init --agent hermes
lean-ctx init --agent pi   # or: pi install npm:pi-lean-ctx
lean-ctx init --agent qoder
lean-ctx init --agent qoderwork

# Force MCP tools if you prefer ctx_* calls
lean-ctx init --agent cursor --mode mcp

# Run diagnostics and auto-fix issues
lean-ctx doctor --fix

# Check current status (MCP, shell, editors)
lean-ctx status --json
```
Updating
The fastest way to update lean-ctx:
```shell
lean-ctx update
```
This auto-detects your installation method, downloads the latest version, and replaces the binary.
You can also update manually using the original installation method:
| Method | Command |
|---|---|
| Homebrew | brew update && brew upgrade lean-ctx |
| Cargo | cargo install lean-ctx |
| npm | npm update -g lean-ctx-bin |
| AUR | yay -Syu lean-ctx |
| Pi | pi install npm:pi-lean-ctx |
| install.sh | Re-run curl -fsSL https://leanctx.com/install.sh \| sh |
| Binary | Download latest from GitHub Releases |
After updating, re-run lean-ctx setup to ensure shell hooks and editor configs are up to date:
```shell
lean-ctx setup    # re-configures shell + editors
source ~/.zshrc   # restart shell
```
Uninstalling
To fully remove lean-ctx from your system:
```shell
lean-ctx uninstall
```
This removes:
- Shell hook entries from ~/.zshrc, ~/.bashrc, ~/.config/fish/config.fish, and PowerShell profiles
- Agent rules files (CLAUDE.md, .cursorrules, etc.)
- Cache and config files in ~/.lean-ctx/
Then remove the binary itself:
| Installed via | Remove command |
|---|---|
| Cargo | cargo uninstall lean-ctx |
| Homebrew | brew uninstall lean-ctx |
| AUR | yay -R lean-ctx |
| Pi | pi uninstall npm:pi-lean-ctx |
| Manual binary | rm $(which lean-ctx) |
Restart your terminal and AI coding tool to complete the uninstallation.
Agent rules
Agent rules tell your AI coding tool to prefer lean-ctx MCP tools (ctx_read, ctx_shell, ctx_search, ctx_tree) over native file reads, shell commands, and search. Run lean-ctx init --agent <name> to install them automatically.
How it works
- lean-ctx init --agent <name> writes a rules/instructions file to the agent's expected location.
- The rules file teaches the AI to use ctx_read, ctx_shell, ctx_search, and ctx_tree instead of native equivalents.
- This is optional - lean-ctx works without agent rules, but performance improves when the AI actively uses lean-ctx tools.
- The rules are idempotent - running init again updates them without duplicating content.
What gets injected
The rules direct the AI agent to use lean-ctx MCP tools instead of the native alternatives:
| Instead of | Use | Why |
|---|---|---|
| Read / cat | ctx_read | Cached, 10 read modes, re-reads cost ~13 tokens |
| Shell / bash | ctx_shell | Pattern compression for git, npm, cargo, docker output |
| Grep / search | ctx_search | Compact, token-efficient results |
| ls / find | ctx_tree | Compact directory maps with file counts |
Note: Write, StrReplace, Delete, and Glob have no lean-ctx replacement - the rules tell the AI to keep using native tools for those.
Where are rules installed?
| AI Tool | Rules File |
|---|---|
| Claude Code | ~/.claude/CLAUDE.md |
| Codex CLI | ~/.codex/instructions.md |
| Cursor | ~/.cursor/rules/lean-ctx.mdc |
| Windsurf | ~/.codeium/windsurf/rules/lean-ctx.md |
| Gemini CLI | ~/.gemini/GEMINI.md |
| VS Code / Copilot | .github/copilot-instructions.md |
| Zed | ~/.config/zed/rules/lean-ctx.md |
| OpenCode | ~/.config/opencode/rules/lean-ctx.md |
| Pi | pi-lean-ctx package (built-in via pi.dev) |
| Qoder | ~/.qoder/settings.json (PreToolUse hooks) |
Manual rules (fallback)
If your AI tool doesn't support agent rules, you can paste these instructions into the system prompt or a rules file:
```text
# lean-ctx - Context Engineering Layer

Default (CLI-first):
- Use lean-ctx CLI commands for reads/search:
  - lean-ctx read <path> [-m mode]
  - lean-ctx grep <pattern> <path>
  - lean-ctx ls <path>
- Use lean-ctx -c "<cmd>" for compressed shell output

If you're running in MCP/Hybrid mode:
- Prefer ctx_* tools instead of native equivalents:
  - ctx_read instead of Read/cat for file reads
  - ctx_shell instead of Shell/bash for commands
  - ctx_search instead of Grep/search for code search
  - ctx_tree instead of ls/find for directory listing

Keep using native tools for: Write, StrReplace, Delete, Glob (no lean-ctx replacement).
Do NOT fall back to native tools for reading/shell/search/tree when a lean-ctx path is available.
```
Cursor users: Save this as .cursor/rules/lean-ctx.mdc in your project root for project-level rules, or ~/.cursor/rules/lean-ctx.mdc for global rules.
Troubleshooting
lean-ctx: command not found
The binary isn't on your PATH. Check where it was installed:
```shell
# Cargo installs here:
ls ~/.cargo/bin/lean-ctx

# Homebrew installs here (macOS):
ls /opt/homebrew/bin/lean-ctx

# Manual installs - wherever you placed it:
ls /usr/local/bin/lean-ctx
```
Add the correct directory to your PATH in your shell config file.
AI isn't using lean-ctx tools
If the AI keeps using native Read/Shell instead of ctx_read/ctx_shell:
- Run lean-ctx init --agent <name> to install agent rules
- Restart the AI tool completely (not just reload)
- Check that the MCP server is connected (look for lean-ctx in the tool's MCP panel)
- Try prompting explicitly: "use ctx_read to read this file"
No lean-ctx tools appearing
- Verify the binary path is correct in your MCP config
- Run lean-ctx doctor to check for issues
- Check the AI tool's MCP logs for connection errors
- Try using the full absolute path to lean-ctx in the config
Shell hook causes hangs
If commands hang after installing the shell hook, set the LEAN_CTX_ACTIVE environment variable to bypass compression for specific commands:
```shell
LEAN_CTX_ACTIVE=1 cargo test
```
This disables lean-ctx compression for that command. Useful for CI/CD scripts and long-running processes.
macOS Gatekeeper blocks lean-ctx
```shell
xattr -d com.apple.quarantine $(which lean-ctx)
```
Windows: PowerShell vs cmd.exe
lean-ctx prioritizes PowerShell on Windows. If you experience issues with cmd.exe, set the LEAN_CTX_SHELL environment variable to force PowerShell:
```powershell
$env:LEAN_CTX_SHELL = "powershell"
```
Next steps
- Project Intelligence Graph - build and query your project's dependency graph
- CEP Protocol - cognitive efficiency scoring for leaner AI responses
- CCP Protocol - cross-session context continuity
- TDD Protocol - token-dense dialect for maximum compression
- All 58 Tools Reference
- CLI Reference