From d2f0f020f8ce1ccb190b0030e8b1acf31f9143cb Mon Sep 17 00:00:00 2001
From: Giray Pultar
Date: Sun, 18 Jan 2026 16:02:55 +0100
Subject: [PATCH] Document OpenAI-compatible API provider usage

Added instructions for using an OpenAI-compatible API provider, including
required parameters and a JSON example.
---
 docs/docs/waveai-modes.mdx | 27 +++++++++++++++++++++++++++
 1 file changed, 27 insertions(+)

diff --git a/docs/docs/waveai-modes.mdx b/docs/docs/waveai-modes.mdx
index 437a6ba99d..ff3de517f9 100644
--- a/docs/docs/waveai-modes.mdx
+++ b/docs/docs/waveai-modes.mdx
@@ -197,6 +197,33 @@ For newer models like GPT-4.1 or GPT-5, the API type is automatically determined
 }
 ```
 
+### OpenAI Compatible
+
+To use an OpenAI-compatible API provider, set the `ai:endpoint`, `ai:apitoken`, and `ai:model` parameters,
+and set `ai:apitype` to `"openai-chat"`.
+
+:::note
+The `ai:endpoint` is *not* a base URL; it must be the full chat-completions endpoint.
+For example: https://api.x.ai/v1/chat/completions
+
+If you provide only the base URL, you will likely get a 404 response.
+:::
+
+```json
+{
+  "xai-grokfast": {
+    "display:name": "xAI Grok Fast",
+    "display:order": 2,
+    "display:icon": "server",
+    "ai:apitype": "openai-chat",
+    "ai:model": "x-ai/grok-4-fast",
+    "ai:endpoint": "https://api.x.ai/v1/chat/completions",
+    "ai:apitoken": ""
+  }
+}
+```
+
+
 ### OpenRouter
 
 [OpenRouter](https://openrouter.ai) provides access to multiple AI models. Using the `openrouter` provider simplifies configuration:
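One way to sanity-check the full-endpoint requirement described in the patch's note is to send a minimal chat-completions request to the configured URL before wiring it into a Wave AI mode. The sketch below is an illustration only, not part of the patch: the endpoint, model, and token values are placeholders copied from the example above, and the request body simply follows the standard OpenAI chat-completions format.

```python
import json
import urllib.error
import urllib.request

# Placeholder values -- substitute the ai:endpoint, ai:model, and ai:apitoken
# from your own Wave AI mode configuration.
ENDPOINT = "https://api.x.ai/v1/chat/completions"  # full endpoint, not just the base URL
MODEL = "x-ai/grok-4-fast"
API_TOKEN = "your-api-token"

# Minimal OpenAI-style chat-completions payload.
payload = {
    "model": MODEL,
    "messages": [{"role": "user", "content": "ping"}],
    "max_tokens": 1,
}

request = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_TOKEN}",
    },
    method="POST",
)

try:
    with urllib.request.urlopen(request) as response:
        # A 200 here means the URL points at a working chat-completions endpoint.
        print("OK:", response.status)
except urllib.error.HTTPError as err:
    # A 404 usually means only the base URL was given (e.g. https://api.x.ai/v1)
    # and the /chat/completions path is missing.
    print("HTTP error:", err.code, err.reason)
```

If this request returns 200, the same `ai:endpoint`, `ai:model`, and `ai:apitoken` values should work in the JSON configuration shown in the patch.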