
🐳 DeepSeek TUI

This terminal-native coding agent is built around DeepSeek V4's 1M-token context window and prefix cache capability. It is distributed as a single binary and requires no Node.js or Python runtime. It also includes an MCP client, a sandbox, and a durable task queue out of the box.

简体中文 README (Simplified Chinese)

Install

deepseek ships as a self-contained Rust binary — no Node.js or Python runtime is required to run it. Pick whichever path you already have on your machine; they all land the same binary on your PATH.

```bash
# 1. npm — easiest if you already use Node. The npm package is a thin
#    installer that downloads the matching prebuilt binary from GitHub
#    Releases; it does NOT add a Node runtime dependency to deepseek itself.
npm install -g deepseek-tui

# 2. Cargo — no Node needed.
cargo install deepseek-tui-cli --locked   # `deepseek` (entry point)
cargo install deepseek-tui --locked       # `deepseek-tui` (TUI binary)

# 3. Direct download — no Node, no toolchain.
#    https://github.com/Hmbown/DeepSeek-TUI/releases
#    Prebuilt for Linux x64/ARM64, macOS x64/ARM64, Windows x64.
```

In mainland China, speed up the npm path with --registry=https://registry.npmmirror.com, or use the Cargo mirror below.


Buy me a coffee

DeepSeek TUI screenshot


What Is It?

DeepSeek TUI is a coding agent that runs entirely in your terminal. It gives DeepSeek's frontier models direct access to your workspace — reading and editing files, running shell commands, searching the web, managing git, and orchestrating sub-agents — all through a fast, keyboard-driven TUI.

Built for DeepSeek V4 (deepseek-v4-pro / deepseek-v4-flash), with a 1M-token context window and native thinking-mode (chain-of-thought) streaming.

Key Features

  • Native RLM (rlm_query) — fans out 1–16 cheap deepseek-v4-flash children in parallel for batched analysis and parallel reasoning, all against the existing API client
  • Thinking-mode streaming — watch the model's chain-of-thought unfold in real time as it works through your tasks
  • Full tool suite — file ops, shell execution, git, web search/browse, apply-patch, sub-agents, MCP servers
  • 1M-token context — automatic intelligent compaction when context fills up; prefix-cache aware for cost efficiency
  • Three modes — Plan (read-only explore), Agent (interactive with approval), YOLO (auto-approved)
  • Reasoning-effort tiers — cycle through off → high → max with Shift + Tab
  • Session save/resume — checkpoint and resume long-running sessions
  • Workspace rollback — side-git pre/post-turn snapshots with /restore and revert_turn, without touching your repo's .git
  • Durable task queue — background tasks survive restarts; think scheduled automation, long-running reviews
  • HTTP/SSE runtime API — deepseek serve --http for headless agent workflows
  • MCP protocol — connect to Model Context Protocol servers for extended tooling; see docs/MCP.md
  • LSP diagnostics — inline error/warning surfacing after every edit via rust-analyzer, pyright, typescript-language-server, gopls, clangd
  • User memory — optional persistent note file injected into the system prompt for cross-session preferences
  • Localized UI — en, ja, zh-Hans, pt-BR with auto-detection
  • Live cost tracking — per-turn and session-level token usage and cost estimates; cache hit/miss breakdown
  • Skills system — composable, installable instruction packs from GitHub with no backend service required
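
The rlm_query fan-out described above can be sketched as a simple parallel map. This is an illustrative Rust sketch under assumptions: the function name, the stub child closure, and the thread-per-child strategy are mine, not DeepSeek TUI's internals; real children would call the deepseek-v4-flash API.

```rust
use std::thread;

/// Fan a batch of prompts out to up to 16 parallel "children" and
/// collect their answers in order. `child` stands in for a call to a
/// cheap flash model; here it is just a stub closure.
fn fan_out<F>(prompts: Vec<String>, child: F) -> Vec<String>
where
    F: Fn(&str) -> String + Send + Copy + 'static,
{
    assert!(prompts.len() <= 16, "rlm_query caps fan-out at 16 children");
    let handles: Vec<_> = prompts
        .into_iter()
        .map(|p| thread::spawn(move || child(&p)))
        .collect();
    // join() preserves submission order, so answers line up with prompts.
    handles.into_iter().map(|h| h.join().unwrap()).collect()
}

fn main() {
    let answers = fan_out(
        vec!["summarize file A".into(), "summarize file B".into()],
        |p| format!("[flash answer to: {p}]"),
    );
    for a in &answers {
        println!("{a}");
    }
}
```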

How It's Wired

deepseek (dispatcher CLI) → deepseek-tui (companion binary) → ratatui interface ↔ async engine ↔ OpenAI-compatible streaming client. Tool calls route through a typed registry (shell, file ops, git, web, sub-agents, MCP, RLM) and results stream back into the transcript. The engine manages session state, turn tracking, the durable task queue, and an LSP subsystem that feeds post-edit diagnostics into the model's context before the next reasoning step.
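
The typed-registry dispatch step above can be sketched in Rust. Trait and type names here are illustrative assumptions, not the project's actual API; a real implementation would carry structured JSON arguments and sandboxed execution.

```rust
use std::collections::HashMap;

/// Every tool exposes a stable name and a fallible call.
trait Tool {
    fn name(&self) -> &'static str;
    fn call(&self, args: &str) -> Result<String, String>;
}

struct ShellTool;
impl Tool for ShellTool {
    fn name(&self) -> &'static str { "exec_shell" }
    fn call(&self, args: &str) -> Result<String, String> {
        // A real implementation would run the command in a sandbox.
        Ok(format!("ran: {args}"))
    }
}

/// The engine looks tools up by the name the model emitted.
struct Registry {
    tools: HashMap<&'static str, Box<dyn Tool>>,
}

impl Registry {
    fn new() -> Self { Registry { tools: HashMap::new() } }
    fn register(&mut self, tool: Box<dyn Tool>) {
        self.tools.insert(tool.name(), tool);
    }
    fn dispatch(&self, name: &str, args: &str) -> Result<String, String> {
        match self.tools.get(name) {
            Some(t) => t.call(args),
            None => Err(format!("unknown tool: {name}")),
        }
    }
}

fn main() {
    let mut reg = Registry::new();
    reg.register(Box::new(ShellTool));
    println!("{:?}", reg.dispatch("exec_shell", "git status"));
}
```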

See docs/ARCHITECTURE.md for the full walkthrough.


Quickstart

```bash
npm install -g deepseek-tui
deepseek --version
deepseek
```

Prebuilt binaries are published for Linux x64, Linux ARM64 (v0.8.8+), macOS x64, macOS ARM64, and Windows x64. For other targets (musl, riscv64, FreeBSD, etc.), see Install from source or docs/INSTALL.md.

On first launch you'll be prompted for your DeepSeek API key. The key is saved to ~/.deepseek/config.toml so it works from any directory without OS credential prompts.

You can also set it ahead of time:

```bash
deepseek auth set --provider deepseek   # saves to ~/.deepseek/config.toml
export DEEPSEEK_API_KEY="YOUR_KEY"      # env var alternative; use ~/.zshenv for non-interactive shells
deepseek
deepseek doctor                         # verify setup
```

To rotate or remove a saved key: deepseek auth clear --provider deepseek.

Linux ARM64 (Raspberry Pi, Asahi, Graviton, HarmonyOS PC)

npm i -g deepseek-tui works on glibc-based ARM64 Linux from v0.8.8 onward. You can also download prebuilt binaries from the Releases page and place them side by side on your PATH.

China / Mirror-friendly Installation

If GitHub or npm downloads are slow from mainland China, use a Cargo registry mirror:

```toml
# ~/.cargo/config.toml
[source.crates-io]
replace-with = "tuna"

[source.tuna]
registry = "sparse+https://mirrors.tuna.tsinghua.edu.cn/crates.io-index/"
```

Then install both binaries (the dispatcher delegates to the TUI at runtime):

```bash
cargo install deepseek-tui-cli --locked   # provides `deepseek`
cargo install deepseek-tui --locked       # provides `deepseek-tui`
deepseek --version
```

Prebuilt binaries can also be downloaded from GitHub Releases. Use DEEPSEEK_TUI_RELEASE_BASE_URL for mirrored release assets.

Windows (Scoop)

Scoop is a Windows package manager. Once installed, run:

```bash
scoop install deepseek-tui
```

Install from source

Works on any Tier-1 Rust target — including musl, riscv64, FreeBSD, and older ARM64 distros.

```bash
# Linux build deps (Debian/Ubuntu/RHEL):
#   sudo apt-get install -y build-essential pkg-config libdbus-1-dev
#   sudo dnf install -y gcc make pkgconf-pkg-config dbus-devel
git clone https://github.com/Hmbown/DeepSeek-TUI.git
cd DeepSeek-TUI
cargo install --path crates/cli --locked   # requires Rust 1.85+; provides `deepseek`
cargo install --path crates/tui --locked   # provides `deepseek-tui`
```

Both binaries are required. Cross-compilation and platform-specific notes: docs/INSTALL.md.

Other API Providers

```bash
# NVIDIA NIM
deepseek auth set --provider nvidia-nim --api-key "YOUR_NVIDIA_API_KEY"
deepseek --provider nvidia-nim

# Fireworks
deepseek auth set --provider fireworks --api-key "YOUR_FIREWORKS_API_KEY"
deepseek --provider fireworks --model deepseek-v4-pro

# Self-hosted SGLang
SGLANG_BASE_URL="http://localhost:30000/v1" deepseek --provider sglang --model deepseek-v4-flash
```

What's New In v0.8.12

A feature release with 20 community PRs on top of the v0.8.11 cache-maxing foundation. Full changelog.

  • Reasoning-effort auto mode — reasoning_effort = "auto" picks the right tier from the prompt: debug/error → Max, search/lookup → Low, default → High
  • Bash arity dictionary — auto_allow = ["git status"] matches git status -s but not git push. Knows git, cargo, npm, docker, kubectl, and more
  • Vim modal editing — normal/insert mode in the composer with standard Vim keybindings
  • Skill registry sync — /skills sync fetches and installs/updates the community registry
  • FIM edit tool — surgical code edits via DeepSeek's /beta fill-in-the-middle endpoint
  • Large-tool-output routing — outsized tool results get truncated previews with spillover, protecting parent context
  • Pluggable sandbox backends — exec_shell can route to Alibaba OpenSandbox or other remote backends
  • Layered permission rulesets — builtin/agent/user priority layers for execpolicy deny/allow rules
  • Cache-aware resident sub-agents — file content prepended for V4 prefix-cache locality; global lease table
  • Unified slash-command namespace — user commands with $1/$2/$ARGUMENTS templates
  • Color::Reset migration — all hardcoded backgrounds replaced with Color::Reset for light-terminal support
  • New docs: SECURITY.md (#648), CODE_OF_CONDUCT.md (#686), zh-Hans locale activation (#652)
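
The Bash arity dictionary entry above can be approximated with word-level prefix matching. A minimal sketch, assuming a simplified model: a command is auto-allowed when its leading words equal an allowed rule (the real dictionary is smarter about per-command arity and flags).

```rust
/// Returns true if `command`'s leading words exactly match one of the
/// allow rules, compared token by token (so "git status -s" matches the
/// rule "git status", but "git push" does not).
fn is_auto_allowed(rules: &[&str], command: &str) -> bool {
    let cmd: Vec<&str> = command.split_whitespace().collect();
    rules.iter().any(|rule| {
        let r: Vec<&str> = rule.split_whitespace().collect();
        !r.is_empty() && cmd.len() >= r.len() && cmd[..r.len()] == r[..]
    })
}

fn main() {
    let rules = ["git status"];
    println!("{}", is_auto_allowed(&rules, "git status -s")); // allowed
    println!("{}", is_auto_allowed(&rules, "git push"));      // not allowed
}
```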

28 community PRs by @merchloubna70-dot. First-time contributor @zichen0116 (#686).


Usage

```bash
deepseek                                        # interactive TUI
deepseek "explain this function"                # one-shot prompt
deepseek --model deepseek-v4-flash "summarize"  # model override
deepseek --yolo                                 # auto-approve tools
deepseek auth set --provider deepseek           # save API key
deepseek doctor                                 # check setup & connectivity
deepseek doctor --json                          # machine-readable diagnostics
deepseek setup --status                         # read-only setup status
deepseek setup --tools --plugins                # scaffold tool/plugin dirs
deepseek models                                 # list live API models
deepseek sessions                               # list saved sessions
deepseek resume --last                          # resume the most recent session
deepseek resume <SESSION_ID>                    # resume a specific session by UUID
deepseek fork <SESSION_ID>                      # fork a session at a chosen turn
deepseek serve --http                           # HTTP/SSE API server
deepseek pr <N>                                 # fetch PR and pre-seed review prompt
deepseek mcp list                               # list configured MCP servers
deepseek mcp validate                           # validate MCP config/connectivity
deepseek mcp-server                             # run dispatcher MCP stdio server
```

Keyboard Shortcuts

| Key | Action |
| --- | --- |
| Tab | Complete / or @ entries; while running, queue draft as follow-up; otherwise cycle mode |
| Shift+Tab | Cycle reasoning-effort: off → high → max |
| F1 | Searchable help overlay |
| Esc | Back / dismiss |
| Ctrl+K | Command palette |
| Ctrl+R | Resume an earlier session |
| Alt+R | Search prompt history and recover cleared drafts |
| Ctrl+S | Stash current draft (/stash list, /stash pop to recover) |
| @path | Attach file/directory context in composer |
| (at composer start) | Select attachment row for removal |
| Alt+↑ | Edit last queued message |

Full shortcut catalog: docs/KEYBINDINGS.md.


Modes

| Mode | Behavior |
| --- | --- |
| Plan 🔍 | Read-only investigation — model explores and proposes a plan (update_plan + checklist_write) before making changes |
| Agent 🤖 | Default interactive mode — multi-step tool use with approval gates; model outlines work via checklist_write |
| YOLO | Auto-approve all tools in a trusted workspace; still maintains plan and checklist for visibility |

Configuration

User config: ~/.deepseek/config.toml. Project overlay: <workspace>/.deepseek/config.toml (the overlay may not set api_key, base_url, provider, or mcp_config_path). config.example.toml documents every option.
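
As a sketch, a minimal user config might look like the following. The reasoning_effort = "auto" key appears in the v0.8.12 release notes; the other key names are illustrative assumptions inferred from the DEEPSEEK_MODEL / DEEPSEEK_PROVIDER variables below, so consult config.example.toml for the authoritative schema.

```toml
# ~/.deepseek/config.toml — illustrative sketch, not the authoritative schema
model = "deepseek-v4-pro"     # assumed key; mirrors DEEPSEEK_MODEL
provider = "deepseek"         # assumed key; mirrors DEEPSEEK_PROVIDER
reasoning_effort = "auto"     # auto tier selection (v0.8.12+)
```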

Key environment variables:

| Variable | Purpose |
| --- | --- |
| DEEPSEEK_API_KEY | API key |
| DEEPSEEK_BASE_URL | API base URL |
| DEEPSEEK_MODEL | Default model |
| DEEPSEEK_PROVIDER | deepseek (default), nvidia-nim, fireworks, sglang |
| DEEPSEEK_PROFILE | Config profile name |
| DEEPSEEK_MEMORY | Set to on to enable user memory |
| NVIDIA_API_KEY / FIREWORKS_API_KEY / SGLANG_API_KEY | Provider auth |
| SGLANG_BASE_URL | Self-hosted SGLang endpoint |
| NO_ANIMATIONS=1 | Force accessibility mode at startup |
| SSL_CERT_FILE | Custom CA bundle for corporate proxies |

UI locale is separate from model language — set locale in settings.toml, use /config locale zh-Hans, or rely on LC_ALL/LANG. See docs/CONFIGURATION.md and docs/MCP.md.


Models & Pricing

| Model | Context | Input (cache hit) | Input (cache miss) | Output |
| --- | --- | --- | --- | --- |
| deepseek-v4-pro | 1M | $0.003625 / 1M* | $0.435 / 1M* | $0.87 / 1M* |
| deepseek-v4-flash | 1M | $0.0028 / 1M | $0.14 / 1M | $0.28 / 1M |

Legacy aliases deepseek-chat / deepseek-reasoner map to deepseek-v4-flash. NVIDIA NIM variants use your NVIDIA account terms.

DeepSeek Pro rates (marked *) reflect a limited-time 75% discount, valid until 15:59 UTC on 31 May 2026. After that time, the TUI cost estimator reverts to the base Pro rates.
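
As a worked example of the arithmetic behind the live cost tracking, using the deepseek-v4-flash rates from the table above (USD per 1M tokens). The helper function is hypothetical, not the TUI's actual estimator.

```rust
/// Estimate the USD cost of a deepseek-v4-flash turn from its token
/// counts, using the published per-1M-token rates.
fn flash_cost_usd(cache_hit_in: u64, cache_miss_in: u64, out: u64) -> f64 {
    const HIT: f64 = 0.0028;  // input, cache hit
    const MISS: f64 = 0.14;   // input, cache miss
    const OUT: f64 = 0.28;    // output
    (cache_hit_in as f64 * HIT + cache_miss_in as f64 * MISS + out as f64 * OUT)
        / 1_000_000.0
}

fn main() {
    // e.g. 900k cached input tokens, 100k uncached, 20k output:
    // (900000*0.0028 + 100000*0.14 + 20000*0.28) / 1e6 = $0.02212
    println!("${:.5}", flash_cost_usd(900_000, 100_000, 20_000));
}
```

This is why the cache hit/miss breakdown matters: a prefix-cache hit is 50x cheaper than a miss at these rates.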


Publishing Your Own Skill

DeepSeek TUI discovers skills from workspace directories (.agents/skills, skills, .opencode/skills, .claude/skills) and the global ~/.deepseek/skills. Each skill is a directory with a SKILL.md file:

```
~/.deepseek/skills/my-skill/
└── SKILL.md
```

Frontmatter required:

```markdown
---
name: my-skill
description: Use this when DeepSeek should follow my custom workflow.
---

# My Skill

Instructions for the agent go here.
```

Commands: /skills (list), /skill <name> (activate), /skill new (scaffold), /skill install github:<owner>/<repo> (community install), plus /skill update, /skill uninstall, and /skill trust. Community installs come straight from GitHub and require no backend service. Installed skills appear in the model-visible session context, and the agent can auto-select relevant skills via the load_skill tool when your task matches their descriptions.


Documentation

| Doc | Topic |
| --- | --- |
| ARCHITECTURE.md | Codebase internals |
| CONFIGURATION.md | Full config reference |
| MODES.md | Plan / Agent / YOLO modes |
| MCP.md | Model Context Protocol integration |
| RUNTIME_API.md | HTTP/SSE API server |
| INSTALL.md | Platform-specific install guide |
| MEMORY.md | User memory feature guide |
| SUBAGENTS.md | Sub-agent role taxonomy and lifecycle |
| KEYBINDINGS.md | Full shortcut catalog |
| RELEASE_RUNBOOK.md | Release process |
| OPERATIONS_RUNBOOK.md | Ops & recovery |

Full Changelog: CHANGELOG.md.


Thanks

This project ships with help from a growing community of contributors:

  • merchloubna70-dot — 28 PRs spanning features, fixes, and VS Code extension scaffolding (#645–#681)
  • WyxBUPT-22 — Markdown rendering for tables, bold/italic, and horizontal rules (#579)
  • loongmiaow-pixel — Windows + China install documentation (#578)
  • 20bytes — User memory docs and help polish (#569)
  • staryxchen — glibc compatibility preflight (#556)
  • Vishnu1837 — glibc compatibility improvements (#565)
  • shentoumengxin — Shell cwd boundary validation (#524)
  • toi500 — Windows paste fix report
  • xsstomy — Terminal startup repaint report
  • melody0709 — Slash-prefix Enter activation report
  • lloydzhou and jeoor — Compaction cost reports
  • Agent-Skill-007 — README clarity pass (#685)
  • woyxiang — Windows Scoop install docs (#696)
  • wangfeng — Pricing/discount info update (#692)
  • zichen0116 — CODE_OF_CONDUCT.md (#686)
  • Hafeez Pizofreude — SSRF protection in fetch_url and Star History chart
  • Unic (YuniqueUnic) — Schema-driven config UI (TUI + web)
  • Jason — SSRF security hardening

Contributing

See CONTRIBUTING.md. Pull requests welcome — check the open issues for good first contributions.

> [!NOTE]
> Not affiliated with DeepSeek Inc.

License

MIT

Star History

Star History Chart
