DevDuck

Extreme minimalist self-adapting AI agent.

One file. Self-healing. Builds itself as it runs.

What Is DevDuck?

An AI agent that hot-reloads its own code, fixes itself when things break, and expands capabilities at runtime. Terminal, browser, cloud — or all at once.

pipx install devduck && devduck

graph LR
    A["🗣️ You"] --> B{"🦆 DevDuck"}
    B -->|CLI / TUI| C["💻 Terminal"]
    B -->|WebSocket| D["🌐 Browser"]
    B -->|AgentCore| E["☁️ Cloud"]
    B -->|Zenoh P2P| F["🔗 Mesh"]
    B -->|MCP| G["🤖 Claude Desktop"]

Features

  • 🔄 Hot Reload

    Edit source code while the agent runs. Changes apply instantly without restart. Protected during execution.

    Learn more

  • 🛠️ 60+ Tools

    Shell, GitHub, browser control, speech, scheduler, ML, messaging. Load more at runtime from any Python package.

    Learn more

  • 🤖 14 Model Providers

    Bedrock, Anthropic, OpenAI, Gemini, Ollama, and 9 more. Auto-detects credentials and picks the best available.

    Learn more

  • 🌙 Ambient Mode

    Background thinking while you're idle. Standard mode explores topics; autonomous mode builds entire features.

    Learn more

  • 🔌 Zenoh P2P

    Multiple DevDuck instances auto-discover each other. Broadcast commands to all or send to specific peers.

    Learn more

  • 🎬 Session Recording

    Record complete sessions with three-layer capture. Resume from any snapshot with full conversation state.

    Learn more

  • 🔗 MCP Integration

    Expose as MCP server for Claude Desktop, or load external MCP servers to extend capabilities.

    Learn more

  • ☁️ AgentCore Deploy

    Deploy to Amazon Bedrock AgentCore with one command. Unified mesh connects CLI + browser + cloud agents.

    Learn more

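The hot-reload feature above boils down to watching a module's source file and re-importing it when the file changes. A minimal sketch of the idea in stdlib Python — the polling approach and the `reload_if_changed` helper are illustrative assumptions, not DevDuck's actual implementation:

```python
import importlib
import os
import types


def reload_if_changed(module: types.ModuleType, last_mtime: float) -> float:
    """Re-import `module` if its source file's mtime has advanced.

    Returns the latest observed mtime so the caller can poll in a loop.
    A real agent would additionally defer the reload while a tool call
    is mid-execution ("protected during execution").
    """
    mtime = os.path.getmtime(module.__file__)
    if mtime > last_mtime:
        importlib.reload(module)  # re-executes the module body in place
    return mtime
```

Polling this from the REPL loop is enough to pick up edits between turns; `importlib.reload` rebinds the module's globals in place, so existing references to the module object see the new code.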

Quick Start

devduck                              # interactive REPL
devduck --tui                        # multi-conversation TUI
devduck "create a REST API"          # one-shot
devduck --record                     # record session for replay
devduck --resume session.zip         # resume from snapshot
devduck deploy --launch              # ship to AgentCore

import devduck
devduck("analyze this code")

Installation | Quickstart


Model Detection

Set your key. DevDuck figures out the rest.

export ANTHROPIC_API_KEY=sk-ant-...   # → uses Anthropic
export OPENAI_API_KEY=sk-...          # → uses OpenAI
export GOOGLE_API_KEY=...             # → uses Gemini
# or just have AWS credentials        # → uses Bedrock
# or nothing at all                   # → uses Ollama

Priority: Bedrock → Anthropic → OpenAI → GitHub → Gemini → Cohere → Writer → Mistral → LiteLLM → LlamaAPI → MLX → Ollama

All 14 providers
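The priority chain above amounts to walking an ordered list of providers and returning the first one whose credential is present. A minimal sketch of that logic — the function name and the reduced three-provider list are illustrative, not DevDuck's code:

```python
import os

# Priority-ordered (provider, required env var) pairs — a subset of
# DevDuck's full 14-provider chain, for illustration only.
PROVIDER_ENV = [
    ("anthropic", "ANTHROPIC_API_KEY"),
    ("openai", "OPENAI_API_KEY"),
    ("gemini", "GOOGLE_API_KEY"),
]


def detect_provider(env=os.environ) -> str:
    """Return the first provider whose credential is set,
    falling back to a local Ollama instance when no key is found."""
    for name, var in PROVIDER_ENV:
        if env.get(var):
            return name
    return "ollama"  # nothing configured: assume a local model
```

Because detection is ordered, setting a higher-priority key (e.g. `ANTHROPIC_API_KEY`) wins even when several credentials are present.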


Access Methods

Protocol    Port/Command    Description
CLI/REPL    devduck         interactive mode
TUI         devduck --tui   multi-conversation UI
MCP stdio   devduck --mcp   for Claude Desktop
Mesh Relay  10000           Browser + AgentCore agents
WebSocket   10001           Per-message streaming
TCP         10002           Raw socket (opt-in)
MCP HTTP    10003           Model Context Protocol (opt-in)
Zenoh P2P   multicast       Auto-discovery across networks

Servers guide



Resources