Adapters are the bridge between DarkDuck’s orchestration layer and agent runtimes. Each adapter knows how to invoke a specific type of AI agent and capture its results.

How Adapters Work

When a heartbeat fires, DarkDuck:
  1. Looks up the agent’s `adapterType` and `adapterConfig`
  2. Calls the adapter’s `execute()` function with the execution context
  3. The adapter spawns or calls the agent runtime
  4. The adapter captures stdout, parses usage/cost data, and returns a structured result
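The flow above can be sketched with a minimal adapter contract (the type names and fields here are illustrative, not DarkDuck’s actual API):

```typescript
// Hypothetical shapes; the real DarkDuck types may differ.
interface ExecutionContext {
  agentId: string;
  prompt: string;
  adapterConfig: Record<string, unknown>;
}

interface AdapterResult {
  status: "success" | "error";
  stdout: string;
  usage?: { inputTokens: number; outputTokens: number; costUsd: number };
}

interface Adapter {
  type: string;
  execute(ctx: ExecutionContext): Promise<AdapterResult>;
}

// A trivial adapter that echoes the prompt instead of spawning a
// real agent runtime, just to show the contract.
const echoAdapter: Adapter = {
  type: "echo",
  async execute(ctx) {
    return { status: "success", stdout: `ran: ${ctx.prompt}` };
  },
};
```

Whatever the runtime, the orchestrator only sees this contract: it hands the adapter an execution context and gets back a structured result.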

Built-in Adapters

| Adapter | Type Key | Description |
| --- | --- | --- |
| Claude Local | `claude_local` | Runs the Claude Code CLI locally |
| Codex Local | `codex_local` | Runs the OpenAI Codex CLI locally |
| Gemini Local | `gemini_local` | Runs the Gemini CLI locally (experimental) |
| OpenCode Local | `opencode_local` | Runs the OpenCode CLI locally (supports multiple providers and models) |
| Hermes Local | `hermes_local` | Runs the Hermes CLI locally |
| Cursor | `cursor` | Runs Cursor in background mode |
| Pi Local | `pi_local` | Runs an embedded Pi agent locally |
| Process | `process` | Executes arbitrary shell commands |
| HTTP | `http` | Sends webhooks to external agents |
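As an illustration, an `adapterConfig` for the `process` adapter might look like the following (the `command` field and the secret-ref syntax are assumptions, not the documented schema):

```typescript
// Illustrative config values only; each adapter defines its own schema.
const processConfig = {
  command: "./run-report.sh",               // assumed field name for the shell command
  cwd: "/srv/jobs",                         // working directory for the process
  env: { API_TOKEN: "secret://api-token" }, // secret-ref format is assumed
  timeoutSec: 300,                          // kill after 5 minutes
  graceSec: 10,                             // grace period before force-kill
};
```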

Adapter Architecture

Each adapter is a package with shared metadata and three module groups, one per consumer:

```
packages/adapters/<name>/
  src/
    index.ts            # Shared metadata (type, label, models)
    server/
      execute.ts        # Core execution logic
      parse.ts          # Output parsing
      test.ts           # Environment diagnostics
    ui/
      parse-stdout.ts   # Stdout -> transcript entries for run viewer
      build-config.ts   # Form values -> adapterConfig JSON
    cli/
      format-event.ts   # Terminal output for `darkduck run --watch`
```

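The shared metadata module (`src/index.ts`) is what every consumer imports; its exact shape here is a sketch, not the real type:

```typescript
// Hypothetical shape of an adapter's shared metadata (src/index.ts);
// in the real package this object would be exported.
const metadata = {
  type: "claude_local",   // type key used to look the adapter up
  label: "Claude Local",  // display name shown in the UI
  models: [] as string[], // selectable models (left empty in this sketch)
};
```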
Three registries consume these modules:
| Registry | What it does |
| --- | --- |
| Server | Executes agents, captures results |
| UI | Renders run transcripts, provides config forms |
| CLI | Formats terminal output for live watching |
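A registry in this scheme is essentially a map from type key to module, with lookup by the agent’s `adapterType`. A minimal sketch of the server-side registry (names are illustrative, not DarkDuck’s actual API):

```typescript
// Hypothetical server-side registry keyed by adapter type.
interface ServerAdapter {
  type: string;
  execute(ctx: unknown): Promise<{ status: string; stdout: string }>;
}

const serverRegistry = new Map<string, ServerAdapter>();

function register(adapter: ServerAdapter): void {
  serverRegistry.set(adapter.type, adapter);
}

function resolve(type: string): ServerAdapter {
  const adapter = serverRegistry.get(type);
  if (!adapter) throw new Error(`unknown adapter type: ${type}`);
  return adapter;
}
```

The UI and CLI registries would follow the same pattern, keyed on the same type strings but mapping to the `ui/` and `cli/` modules instead.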

Choosing an Adapter

| Need | Recommended Adapter |
| --- | --- |
| Coding agent with Claude | `claude_local` |
| Coding agent with OpenAI | `codex_local` |
| Multi-provider flexibility | `opencode_local` |
| Run a script or command | `process` |
| Call an external service | `http` |
| Something custom | Create your own |

For most use cases, `claude_local` or `codex_local` are the best starting points. They support session persistence, skills injection, and structured output parsing out of the box.

Common Configuration Fields

Most adapters share these configuration fields:
| Field | Description |
| --- | --- |
| `cwd` | Working directory for the agent process |
| `model` | LLM model to use |
| `promptTemplate` | Prompt template with `{{variable}}` substitution |
| `env` | Environment variables (supports secret refs) |
| `timeoutSec` | Maximum execution time, in seconds |
| `graceSec` | Grace period, in seconds, before force-kill after a timeout |
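The `{{variable}}` substitution in `promptTemplate` can be sketched as a simple regex replace (a minimal stand-in; the real implementation may handle escaping and missing variables differently):

```typescript
// Minimal {{variable}} substitution; illustrative only.
function renderTemplate(template: string, vars: Record<string, string>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (match, key) =>
    key in vars ? vars[key] : match, // leave unknown variables untouched
  );
}

// renderTemplate("Review {{repo}} on {{branch}}", { repo: "api", branch: "main" })
// → "Review api on main"
```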