Local development

Build and run AI-assisted development on your own machine. This page highlights key tools that keep your code and data local: pi, Zed, and others.

What we mean by local development

Local development here means doing your coding and AI-assisted work on your own hardware—terminal harnesses, editors, and model runtimes—so that prompts, code, and data don’t have to go through a vendor’s cloud unless you choose to. Tools like pi and Zed can use local models (e.g. via Ollama) so you get AI assistance without sending your codebase or conversations to a third party.

pi: Terminal coding harness

Pi is a minimal terminal-based coding agent: “there are many coding agents, but this one is mine.” You adapt it with TypeScript extensions, skills, prompt templates, and themes. It supports 15+ providers (Anthropic, OpenAI, Google, Ollama, OpenRouter, etc.), tree-structured sessions, AGENTS.md/SYSTEM.md context, and pi packages via npm or git. No baked-in sub-agents or plan mode—you extend it the way you want. Great for local-first workflows when you point it at Ollama or other self-hosted endpoints.

  • 15+ providers, hundreds of models; switch with /model or Ctrl+L
  • Tree sessions; AGENTS.md, SYSTEM.md, compaction, skills, prompt templates
  • Extensions and pi packages; interactive, print/JSON, RPC, SDK modes

Install (npm):

npm install -g @mariozechner/pi-coding-agent

pi.dev → | GitHub →
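As a sketch of how a local-first pi project might be set up (the directory name and file contents below are illustrative): pi reads AGENTS.md from the project root as context when it starts, so you can prime the agent before the first prompt.

```shell
# Illustrative setup: pi reads AGENTS.md (and SYSTEM.md, if present)
# from the project root as context on startup.
mkdir -p pi-demo
cat > pi-demo/AGENTS.md <<'EOF'
# Project context for the coding agent
- TypeScript monorepo; build with `npm run build`, test with `npm test`
- Keep changes small; prefer editing existing modules over adding new ones
EOF
# Then launch pi from the project root so it picks the file up:
#   cd pi-demo && pi
cat pi-demo/AGENTS.md
```

From there, /model (or Ctrl+L) switches providers and models at runtime, including local ones served by Ollama.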

Zed Editor

Zed is a high-performance code editor with collaboration built in. It works with the AI Dev Suite and OpenCode via the Agent Client Protocol (ACP): the editor launches the agent as a subprocess and talks to it over a local connection. All inference and data (Ollama models, memory, knowledge bases) can stay on your machine. You get AI-assisted editing and chat without sending your code to a cloud; configure agent_servers in Zed to point at your local stack.

  • Fast, native editor; real-time collaboration
  • ACP integration for local AI (AI Dev Suite, OpenCode)
  • One-click install from the AI Dev Suite; or install standalone below

Install (curl):

curl -fsSL https://zed.dev/install.sh | sh

zed.dev →
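To wire Zed to a local ACP agent, add an agent_servers entry to your Zed settings.json. The fragment below is a sketch: the entry name and the opencode command path are illustrative and depend on what you have installed.

```json
{
  "agent_servers": {
    "OpenCode": {
      "command": "opencode",
      "args": []
    }
  }
}
```

Zed then launches that command as a subprocess and routes agent requests to it over ACP, so nothing leaves your machine.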

Other local dev tools

A typical local development stack can include:

  • OpenCode — AI coding agent; works with Zed and the AI Dev Suite via ACP. Install: curl -fsSL https://opencode.ai/install | bash
  • Ollama — Run Llama and other models locally. Install: curl -fsSL https://ollama.com/install.sh | sh. Ollama page →
  • LM Studio — Local LLM runtime and UI. Install: curl -fsSL https://lmstudio.ai/install.sh | bash
  • AI Dev Suite — Chat, memory, drive, RAG, one-click install for Zed, OpenCode, Ollama, LM Studio. AI Dev Suite →
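Once Ollama is installed, it answers on an HTTP API (port 11434 by default). The sketch below queries the model list and degrades gracefully when no server is running:

```shell
# Query a local Ollama server's model list via its HTTP API (default port 11434).
# If no server is running, print a hint instead so the script still succeeds.
if curl -fsS --max-time 2 http://localhost:11434/api/tags -o models.json 2>/dev/null; then
  cat models.json
else
  echo "no Ollama server on :11434 (start one with 'ollama serve')"
fi > ollama-check.txt
cat ollama-check.txt
```

Both pi and Zed (via an ACP agent) can point at this same endpoint, so one Ollama instance serves your whole local stack.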

Full list of install commands: /install/

How this fits the Local AI stack

Using pi, Zed, OpenCode, and Ollama (or LM Studio) on your machine means your development workflow can be fully local: prompts and code stay on your hardware. That brings the same data-sovereignty and cost benefits as the rest of the Local AI offering. We can help you wire these tools together—Ollama on a Mac Mini, Zed and pi on your laptop, or a full AI Dev Suite setup—so you get a coherent local development environment.

Next steps

Want a local development setup with pi, Zed, and local models? Check Install for commands and AI Dev Suite for the full stack—or get in touch to scope a tailored setup.

View install commands