Zed Editor AI Setup: Configure Custom LLM Providers and External Agents

Zed Editor with OfoxAI configured

What This Guide Covers

Zed is a high-performance code editor built in Rust with native AI features. It offers one of the most complete AI coding experiences available: a built-in agent, support for external agents like Claude Code and Codex CLI, and custom LLM provider configuration via the OpenAI-compatible protocol.

This guide walks through configuring Zed to use OfoxAI as a custom LLM provider, giving you access to GPT, Claude, Gemini, Qwen, Kimi, and other models through a single API endpoint.

For the official integration reference, see the OfoxAI Zed integration docs.

Why Use a Custom LLM Provider in Zed

Paired with OfoxAI, a single API key gives you access to 100+ mainstream LLMs in Zed. Benefits include:

  • One API key for all models — Access OpenAI, Anthropic, Google, Alibaba, Moonshot, and others without managing separate accounts
  • Unified billing and usage tracking — One dashboard for everything
  • Model flexibility — Add new models by editing your configuration, no waiting for Zed to add native support

Prerequisites

Go to the OfoxAI Console to create an API Key.
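Before wiring the key into Zed, it can help to confirm it works. A minimal sketch using Python's standard library against the OpenAI-compatible `/models` endpoint (the same base URL this guide configures; the endpoint path follows the OpenAI protocol):

```python
# Sketch: verify an OfoxAI API key by listing available models.
# Assumes the base URL from this guide and standard Bearer auth,
# as the OpenAI-compatible protocol expects.
import json
import urllib.request

BASE_URL = "https://api.ofox.ai/v1"

def build_models_request(api_key: str) -> urllib.request.Request:
    """Build an authenticated GET /models request."""
    return urllib.request.Request(
        f"{BASE_URL}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

def list_model_ids(api_key: str) -> list[str]:
    """Fetch the model list and return the model IDs."""
    with urllib.request.urlopen(build_models_request(api_key)) as resp:
        data = json.load(resp)
    return [m["id"] for m in data["data"]]

# Usage (uncomment with a real key from the OfoxAI Console):
# print(list_model_ids("YOUR_OFOXAI_API_KEY"))
```

If the call returns a list of model IDs, the key and endpoint are good; an authentication error here will reproduce inside Zed as well.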

Two Ways to Add a Provider

Method 1: Agent Panel (GUI)

Add a provider visually through the Agent Panel, without editing JSON:

  1. Press Cmd+Shift+A to open the Agent Panel
  2. Click + Add Provider in the LLM Providers area
  3. Fill in the configuration:

Zed Agent Panel — add provider form

| Label | Field | Value |
| --- | --- | --- |
| 1 | Provider Name | OfoxAI |
| 2 | API URL | https://api.ofox.ai/v1 |
| 3 | API Key | Your OfoxAI API Key |
| 4 | Model Name | openai/gpt-5.4-mini |
| 5 | Max Completion Tokens | 512000 |
| 6 | Capabilities | Check the capabilities supported by the model |

Click + Add Model to add more models.

Method 2: settings.json

Batch-configure all models via the settings file. This is ideal for sharing configurations across machines or managing multiple models.

  1. Press Cmd+, to open settings, then click Edit in settings.json in the top right

Zed settings — Edit in settings.json

  2. Add the following configuration:

```json
{
  "language_models": {
    "openai_compatible": {
      "OfoxAI": {
        "api_url": "https://api.ofox.ai/v1",
        "available_models": [
          {
            "name": "openai/gpt-5.3-codex",
            "display_name": "GPT-5.3 Codex",
            "max_tokens": 512000,
            "max_output_tokens": 65536,
            "capabilities": {
              "tools": true,
              "images": true
            }
          },
          {
            "name": "openai/gpt-5-mini",
            "display_name": "GPT-5 Mini",
            "max_tokens": 256000,
            "max_output_tokens": 32768,
            "capabilities": {
              "tools": true,
              "images": true
            }
          },
          {
            "name": "moonshotai/kimi-k2.5",
            "display_name": "Kimi K2.5",
            "max_tokens": 262144,
            "max_output_tokens": 262144,
            "capabilities": {
              "tools": true,
              "images": true
            }
          },
          {
            "name": "bailian/qwen3-max",
            "display_name": "Qwen3 Max",
            "max_tokens": 256000,
            "max_output_tokens": 64000,
            "capabilities": {
              "tools": true,
              "images": false
            }
          }
        ]
      }
    }
  }
}
```
  3. After saving, Zed will prompt you to enter your API Key

The API Key is securely stored in the system keychain (macOS Keychain / Linux Secret Service) and is never written in plaintext to config files.

Getting Started with the Agent Panel

Zed’s Agent Panel has two modes:

| Mode | Description |
| --- | --- |
| Zed Agent | Built-in AI assistant using your configured LLM Provider (e.g., OfoxAI) |
| External Agents | External agents (Claude Code, Codex CLI, Gemini CLI) that run independently |

To start a conversation with Zed Agent:

Zed Agent Panel — model selection

  1. Click + then select Zed Agent (Cmd+N)
  2. In the bottom-right model selector, look under the OfoxAI group and select your model
  3. Type your prompt and send

Zed Agent Panel — chat interface
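Under the hood, the built-in agent talks to the configured `api_url` over the OpenAI-compatible chat-completions protocol. A rough sketch of that request shape (illustrative only, not Zed's actual code; field names follow the OpenAI chat format):

```python
# Sketch: the kind of request an openai_compatible provider receives.
# Assumes the base URL from this guide and the OpenAI chat format.
import json
import urllib.request

API_URL = "https://api.ofox.ai/v1"

def build_chat_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST to /chat/completions in the OpenAI-compatible format."""
    payload = {
        "model": model,  # e.g. "openai/gpt-5-mini"
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{API_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

This is why any OpenAI-compatible gateway works as a Zed provider: the editor only needs the base URL, a Bearer key, and model names that the gateway recognizes.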

External agents appear in the External Agents area and have their own credentials, model access, and billing — separate from your configured LLM provider.

Recommended models by scenario:

| Scenario | Model | Description |
| --- | --- | --- |
| Agent coding | openai/gpt-5.4-mini | 512K context, strongest overall |
| Quick Q&A | openai/gpt-5-mini | 256K context, fast and low cost |
| Long output | moonshotai/kimi-k2.5 | 262K context / 262K output |
| Chinese scenarios | bailian/qwen3-max | 256K context, bilingual Chinese/English |

See the Model Catalog for the full model list.

Adding More Models

Each model entry requires the following parameters:

| Field | Required | Description |
| --- | --- | --- |
| name | Yes | Model ID, e.g., openai/gpt-5.4-mini |
| display_name | No | UI display name |
| max_tokens | Yes | Context window size |
| max_output_tokens | No | Maximum output token count |
| capabilities.tools | No | Whether Function Calling is supported |
| capabilities.images | No | Whether image input is supported |

Add or remove models by editing the available_models array in settings.json.
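Since only `name` and `max_tokens` are required, a minimal entry can be as short as this (the model shown is one from the table above; substitute the model you actually want):

```json
{
  "name": "bailian/qwen3-max",
  "max_tokens": 256000
}
```

Omitted optional fields simply fall back to defaults; set `capabilities` explicitly if the model supports tools or images.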

Pro Tip: If filling in parameters by hand is tedious, send the Zed integration docs page URL along with https://api.ofox.ai/v1/models to an AI and have it generate the complete settings.json configuration for you.

Troubleshooting

OfoxAI not showing in model list

Verify settings.json format is correct (watch for missing commas or brackets), then restart Zed.
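One quick way to catch a malformed file is to parse it yourself. A minimal check, assuming the default settings path (`~/.config/zed/settings.json` on Linux/macOS) and a comment-free file — note that Zed tolerates comments in settings.json while Python's strict `json` parser does not:

```python
# Sketch: sanity-check the OfoxAI provider block before restarting Zed.
# Raises json.JSONDecodeError on syntax errors (missing commas/brackets)
# and KeyError if the provider block is absent.
import json
from pathlib import Path

DEFAULT_PATH = Path.home() / ".config" / "zed" / "settings.json"

def ofox_models(settings_text: str) -> list[str]:
    """Parse the settings JSON and return the configured OfoxAI model names."""
    cfg = json.loads(settings_text)
    provider = cfg["language_models"]["openai_compatible"]["OfoxAI"]
    return [m["name"] for m in provider["available_models"]]

# Usage:
# print(ofox_models(DEFAULT_PATH.read_text()))
```

If this parses cleanly and lists your models but Zed still shows nothing, restart the editor so it reloads the provider configuration.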

Authentication error

Open the Command Palette Cmd+Shift+P, search for language model: reset credentials, and re-enter your API Key.

Model does not support tool calling

Set capabilities.tools to false for that model in your configuration. If you leave it as true for a model that does not actually support function calling, requests may fail.

Responses are empty or cut off

Check your max_output_tokens setting. If it is too low, the model may truncate its response. For coding tasks, 16384 or higher is recommended.

Connection timeouts

Confirm that the API URL is exactly https://api.ofox.ai/v1 — no trailing slash, no missing /v1 suffix.

Summary

Zed’s native AI support makes it straightforward to connect a custom LLM provider. Configure OfoxAI through either the Agent Panel GUI or settings.json, add the models you want, and you have a fully functional AI coding assistant running inside a fast, modern editor. For tasks that need specialized agents, Zed’s External Agent support lets you run Claude Code, Codex CLI, or Gemini CLI alongside the built-in agent.

For the full provider documentation and the latest supported models, see the OfoxAI Zed integration docs.