Auto-discover models from OpenAI-compatible provider endpoints #6231

@ochsec

Description

For OpenAI-compatible local providers (LM Studio, Ollama, llama.cpp, etc.), users currently need to manually list all available models in their opencode.json configuration. This is tedious and error-prone, especially for local providers where models change frequently as they are loaded/unloaded.

Most OpenAI-compatible providers expose a standard /v1/models endpoint that lists all available models. OpenCode should automatically query this endpoint and discover available models, eliminating the need for manual configuration.
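Most servers return the standard OpenAI list shape from this endpoint. A typical response looks like the following (model ids are illustrative; fields beyond `id` vary by server):

```json
{
  "object": "list",
  "data": [
    { "id": "google/gemma-3n-e4b", "object": "model" },
    { "id": "qwen/qwen3-coder-30b", "object": "model" }
  ]
}
```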

Proposed Solution

When a provider uses @ai-sdk/openai-compatible, OpenCode should:

  1. Make a GET request to {baseURL}/models (e.g., http://localhost:1234/v1/models)
  2. Parse the response to extract available models
  3. Automatically populate the model list, merging with any manually configured models
  4. Optionally refresh this list periodically or on-demand
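Steps 1–3 could be sketched roughly as below. This is illustrative only: `discoverModels` and `mergeDiscoveredModels` are hypothetical names, not existing OpenCode internals, and the merge policy (manual config wins on id collisions) is one reasonable choice, not a settled design.

```typescript
// Hypothetical sketch of auto-discovery for @ai-sdk/openai-compatible
// providers; names and types here are illustrative, not OpenCode's API.

interface ModelConfig {
  name: string;
}

// Steps 2-3: parse the OpenAI-style /v1/models payload into a model map,
// letting manually configured entries override discovered defaults.
function mergeDiscoveredModels(
  payload: { data: { id: string }[] },
  configured: Record<string, ModelConfig>,
): Record<string, ModelConfig> {
  const merged: Record<string, ModelConfig> = {};
  for (const model of payload.data) {
    // Discovered models get their id as a default display name.
    merged[model.id] = { name: model.id };
  }
  return { ...merged, ...configured };
}

// Step 1: GET {baseURL}/models, then merge with any configured models.
async function discoverModels(
  baseURL: string,
  configured: Record<string, ModelConfig> = {},
): Promise<Record<string, ModelConfig>> {
  const res = await fetch(`${baseURL}/models`);
  if (!res.ok) throw new Error(`model discovery failed: HTTP ${res.status}`);
  return mergeDiscoveredModels(await res.json(), configured);
}
```

Step 4 (periodic refresh) would simply re-run `discoverModels` on a timer or on demand and swap in the new map.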

Benefits

  • Better UX: No manual model configuration needed
  • Dynamic updates: Automatically reflects models as they're loaded/unloaded in local servers
  • Fewer errors: Eliminates misconfiguration issues
  • Consistency: Works the same way across all OpenAI-compatible providers

Current Workaround

Users can:

  1. Manually query http://localhost:1234/v1/models to see available models
  2. Manually add each model to opencode.json
  3. Use the community plugin opencode-lmstudio for LM Studio

Example: Current Manual Configuration

{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "lmstudio": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "LM Studio (local)",
      "options": {
        "baseURL": "http://127.0.0.1:1234/v1"
      },
      "models": {
        "google/gemma-3n-e4b": {
          "name": "Gemma 3n-e4b (local)"
        },
        "qwen/qwen3-coder-30b": {
          "name": "Qwen 3 Coder 30B"
        }
      }
    }
  }
}

Example: Proposed Auto-Discovery

{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "lmstudio": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "LM Studio (local)",
      "options": {
        "baseURL": "http://127.0.0.1:1234/v1"
      }
      // models automatically discovered!
    }
  }
}

OpenCode Version

1.0.58+

Additional Context

The opencode-lmstudio plugin already implements this functionality successfully for LM Studio. Consider integrating similar logic into core OpenCode for all @ai-sdk/openai-compatible providers.
