README.md (30 additions, 0 deletions)
@@ -100,6 +100,7 @@ You can configure OpenCode using environment variables:
|`AZURE_OPENAI_ENDPOINT`| For Azure OpenAI models |
|`AZURE_OPENAI_API_KEY`| For Azure OpenAI models (optional when using Entra ID) |
|`AZURE_OPENAI_API_VERSION`| For Azure OpenAI models |
|`LOCAL_ENDPOINT`| For self-hosted models |
|`SHELL`| Default shell to use (if not specified in config) |
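
For example, the Azure-related variables above might be exported in your shell before launching OpenCode. The endpoint, key, and API version below are placeholder values, not defaults taken from this README:

```bash
# Placeholder values; substitute the details of your own Azure OpenAI resource
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com"
export AZURE_OPENAI_API_KEY="your-api-key"       # optional when authenticating with Entra ID
export AZURE_OPENAI_API_VERSION="2024-02-01"     # example API version
```
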
### Shell Configuration
@@ -566,6 +567,35 @@ The AI assistant can access LSP features through the `diagnostics` tool, allowing
While the LSP client implementation supports the full LSP protocol (including completions, hover, definition, etc.), currently only diagnostics are exposed to the AI assistant.
## Using a self-hosted model provider

OpenCode can also load and use models from a self-hosted (OpenAI-compatible) provider. This is useful for developers who want to experiment with custom models.

### Configuring a self-hosted provider

You can use a self-hosted model by setting the `LOCAL_ENDPOINT` environment variable. OpenCode will then load and use the models served by that endpoint.

```bash
LOCAL_ENDPOINT=http://localhost:1235/v1
```
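
If the provider exposes an OpenAI-compatible API (for example llama.cpp, LM Studio, or vLLM), pointing OpenCode at it for a single session might look like the following sketch, assuming the CLI binary is invoked as `opencode`:

```bash
# Hypothetical invocation: set the variable for one run of the assumed `opencode` binary
LOCAL_ENDPOINT=http://localhost:1235/v1 opencode
```
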
### Configuring a self-hosted model
You can also configure a self-hosted model in the configuration file under the `agents` section:
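
The diff ends before the example itself, so the snippet below is only an illustrative sketch. It assumes a JSON configuration file, an agent entry named `coder`, and a `local.`-prefixed model ID; the model name and `maxTokens` value are placeholders rather than values documented in this README:

```json
{
  "agents": {
    "coder": {
      "model": "local.your-model-name",
      "maxTokens": 4096
    }
  }
}
```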