docs(conversation): align runtime model env vars & update component examples
- Add and document runtime env vars: OPENAI_MODEL, AZURE_OPENAI_MODEL, ANTHROPIC_MODEL, GOOGLEAI_MODEL, MISTRAL_MODEL, HUGGINGFACE_MODEL, OLLAMA_MODEL (with defaults)
- Update environment reference page with the new env var names and defaults
- Replace env-var placeholders in component YAML examples with literal default model names for OpenAI, GoogleAI, Anthropic, Mistral, Hugging Face and Ollama
- Add "(configurable via '<ENV>' environment variable)" to each provider's Spec metadata table to show override option
- Add Azure OpenAI usage section to `conversation.openai` doc (how to use `apiType: azure` and `AZURE_OPENAI_MODEL`)
- Clarify AWS Bedrock uses standard AWS auth (no Bedrock-specific env var added)
- Fix small markdown lint issues (trailing newlines/whitespace)
Signed-off-by: Erin La <107987318+giterinhub@users.noreply.github.com>
| `key` | Y | API key for Anthropic. | `"mykey"` |
-| `model` | N | The Anthropic LLM to use. Defaults to `claude-3-5-sonnet-20240620` (configurable via `DAPR_CONVERSATION_ANTHROPIC_MODEL` environment variable). | `${{DAPR_CONVERSATION_ANTHROPIC_MODEL}}` |
+| `model` | N | The Anthropic LLM to use. Defaults to `claude-sonnet-4-20250514` (configurable via the `ANTHROPIC_MODEL` environment variable). | `claude-sonnet-4-20250514` |
| `cacheTTL` | N | A time-to-live value for a prompt cache to expire. Uses Golang duration format. | `10m` |
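To make the change above concrete, a `conversation.anthropic` component pinned to the new literal default might look like the following sketch. This is an illustration only, not an excerpt from the diff; the component name and key value are placeholders.

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: anthropic
spec:
  type: conversation.anthropic
  version: v1
  metadata:
    - name: key
      value: "mykey"                        # placeholder API key
    - name: model
      value: claude-sonnet-4-20250514       # literal default instead of an env-var placeholder
    - name: cacheTTL
      value: 10m
```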
## Related links
-- [Conversation API overview]({{< ref conversation-overview.md >}})
+- [Conversation API overview]({{< ref conversation-overview.md >}})
-| `model` | N | The GoogleAI LLM to use. Defaults to `gemini-1.5-flash` (configurable via `DAPR_CONVERSATION_GOOGLEAI_MODEL` environment variable). | `${{DAPR_CONVERSATION_GOOGLEAI_MODEL}}` |
+| `model` | N | The GoogleAI LLM to use. Defaults to `gemini-2.5-flash-lite` (configurable via the `GOOGLEAI_MODEL` environment variable). | `gemini-2.5-flash-lite` |
| `cacheTTL` | N | A time-to-live value for a prompt cache to expire. Uses Golang duration format. | `10m` |
| `key` | Y | API key for Huggingface. | `mykey` |
-| `model` | N | The Huggingface LLM to use. Defaults to `deepseek-ai/DeepSeek-R1-Distill-Qwen-32B` (configurable via `DAPR_CONVERSATION_HUGGINGFACE_MODEL` environment variable). | `${{DAPR_CONVERSATION_HUGGINGFACE_MODEL}}` |
+| `model` | N | The Huggingface LLM to use. Defaults to `deepseek-ai/DeepSeek-R1-Distill-Qwen-32B` (configurable via the `HUGGINGFACE_MODEL` environment variable). | `deepseek-ai/DeepSeek-R1-Distill-Qwen-32B` |
| `cacheTTL` | N | A time-to-live value for a prompt cache to expire. Uses Golang duration format. | `10m` |
## Related links
-- [Conversation API overview]({{< ref conversation-overview.md >}})
+- [Conversation API overview]({{< ref conversation-overview.md >}})
-| `model` | N | The Mistral LLM to use. Defaults to `open-mistral-7b` (configurable via `DAPR_CONVERSATION_MISTRAL_MODEL` environment variable). | `${{DAPR_CONVERSATION_MISTRAL_MODEL}}` |
+| `model` | N | The Mistral LLM to use. Defaults to `open-mistral-7b` (configurable via the `MISTRAL_MODEL` environment variable). | `open-mistral-7b` |
| `cacheTTL` | N | A time-to-live value for a prompt cache to expire. Uses Golang duration format. | `10m` |
## Related links
-- [Conversation API overview]({{< ref conversation-overview.md >}})
+- [Conversation API overview]({{< ref conversation-overview.md >}})
-| `model` | N | The Ollama LLM to use. Defaults to `llama3.2:latest` (configurable via `DAPR_CONVERSATION_OLLAMA_MODEL` environment variable). | `${{DAPR_CONVERSATION_OLLAMA_MODEL}}` |
+| `model` | N | The Ollama LLM to use. Defaults to `llama3.2:latest` (configurable via the `OLLAMA_MODEL` environment variable). | `llama3.2:latest` |
| `cacheTTL` | N | A time-to-live value for a prompt cache to expire. Uses Golang duration format. | `10m` |
-| `model` | N | The OpenAI LLM to use. Defaults to `gpt-5-nano` (configurable via `DAPR_CONVERSATION_OPENAI_MODEL` environment variable). | `${{DAPR_CONVERSATION_OPENAI_MODEL}}` |
+| `model` | N | The OpenAI LLM to use. Defaults to `gpt-5-nano` (configurable via the `OPENAI_MODEL` environment variable). | `gpt-5-nano` |
| `cacheTTL` | N | A time-to-live value for a prompt cache to expire. Uses Golang duration format. | `10m` |
+## Azure OpenAI usage
+
+The `conversation.openai` component can target either OpenAI's hosted API or Azure OpenAI. To select Azure OpenAI, set the component's `apiType` metadata to `azure` and provide the usual Azure-specific connection settings (for example, the endpoint and API key) in the component configuration.
+
+When `apiType: azure` is used, the `AZURE_OPENAI_MODEL` environment variable can be set to supply a default Azure model identifier for cases where the component's `model` metadata is not provided. This environment variable only affects the component when `apiType` is set to `azure`; the regular `OPENAI_MODEL` variable remains the default for non-Azure OpenAI usage. If `model` is omitted from the component metadata and neither `AZURE_OPENAI_MODEL` nor `OPENAI_MODEL` is set, the component falls back to its built-in default model.
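The full example block added by this change is not reproduced here; as a rough sketch under the assumptions above, an Azure-targeted component could look like the following. `apiType` and `key` come from the prose; the `endpoint` field name and its value are assumptions to verify against the component reference.

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: azure-openai
spec:
  type: conversation.openai
  version: v1
  metadata:
    - name: key
      value: "mykey"                                  # placeholder Azure OpenAI API key
    - name: apiType
      value: azure                                    # switch the component to Azure OpenAI
    - name: endpoint                                  # assumed field name for the Azure endpoint
      value: "https://my-resource.openai.azure.com/"  # placeholder endpoint
    # model omitted: AZURE_OPENAI_MODEL (if set) supplies the default,
    # otherwise the built-in default (gpt-4.1-mini) is used.
```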
## Related links
-- [Conversation API overview]({{< ref conversation-overview.md >}})
+- [Conversation API overview]({{< ref conversation-overview.md >}})
daprdocs/content/en/reference/environment/_index.md: 10 additions & 2 deletions
@@ -21,12 +21,20 @@ The following table lists the environment variables used by the Dapr runtime, CL
| SSL_CERT_DIR | Dapr sidecar | Specifies the location where the public certificates for all the trusted certificate authorities (CA) are located. Not applicable when the sidecar is running as a process in self-hosted mode.|
| DAPR_HELM_REPO_URL | Your private Dapr Helm chart url | Specifies a private Dapr Helm chart url, which defaults to the official Helm chart URL: `https://dapr.github.io/helm-charts`|
| DAPR_HELM_REPO_USERNAME | A username for a private Helm chart | The username required to access the private Dapr Helm chart. If it can be accessed publicly, this env variable does not need to be set|
-| DAPR_HELM_REPO_PASSWORD | A password for a private Helm chart |The password required to access the private Dapr helm chart. If it can be accessed publicly, this env variable does not need to be set|
+| DAPR_HELM_REPO_PASSWORD | A password for a private Helm chart |The password required to access the private Dapr helm chart. If it can be accessed publicly, this env variable does not need to be set|
| OTEL_EXPORTER_OTLP_ENDPOINT | OpenTelemetry Tracing | Sets the Open Telemetry (OTEL) server address, turns on tracing. (Example: `http://localhost:4318`) |
26
26
| OTEL_EXPORTER_OTLP_INSECURE | OpenTelemetry Tracing | Sets the connection to the endpoint as unencrypted. (`true`, `false`) |
27
27
| OTEL_EXPORTER_OTLP_PROTOCOL | OpenTelemetry Tracing | The OTLP protocol to use Transport protocol. (`grpc`, `http/protobuf`, `http/json`) |
28
28
| DAPR_COMPONENTS_SOCKETS_FOLDER | Dapr runtime and the .NET, Go, and Java pluggable component SDKs | The location or path where Dapr looks for Pluggable Components Unix Domain Socket files. If unset this location defaults to `/tmp/dapr-components-sockets`|
29
29
| DAPR_COMPONENTS_SOCKETS_EXTENSION | .NET and Java pluggable component SDKs | A per-SDK configuration that indicates the default file extension applied to socket files created by the SDKs. Not a Dapr-enforced behavior. |
30
30
| DAPR_PLACEMENT_METADATA_ENABLED | Dapr placement | Enable an endpoint for the Placement service that exposes placement table information on actor usage. Set to `true` to enable in self-hosted mode. [Learn more about the Placement API]({{< ref placement_api.md >}}) |
31
31
| DAPR_HOST_IP | Dapr sidecar | The host's chosen IP address. If not specified, will loop over the network interfaces and select the first non-loopback address it finds.|
-| DAPR_HEALTH_TIMEOUT | SDKs | Sets the time on the "wait for sidecar" availability. Overrides the default timeout setting of 60 seconds. |
+| DAPR_HEALTH_TIMEOUT | SDKs | Sets the time on the "wait for sidecar" availability. Overrides the default timeout setting of 60 seconds. |
+| OPENAI_MODEL | Conversation components | Default model name used by the `conversation.openai` component at runtime when no `model` metadata is set in the component file. Default: `gpt-5-nano`. |
+| AZURE_OPENAI_MODEL | Conversation components / Azure config | Default Azure model name used by the `conversation.openai` component when `apiType: azure` is configured and no `model` metadata is provided. Default: `gpt-4.1-mini`. |
+| ANTHROPIC_MODEL | Conversation components | Default model name used by the `conversation.anthropic` component when no `model` metadata is set in the component file. Default: `claude-sonnet-4-20250514`. |
+| GOOGLEAI_MODEL | Conversation components | Default model name used by the `conversation.googleai` component when no `model` metadata is set in the component file. Default: `gemini-2.5-flash-lite`. |
+| MISTRAL_MODEL | Conversation components | Default model name used by the `conversation.mistral` component when no `model` metadata is set in the component file. Default: `open-mistral-7b`. |
+| HUGGINGFACE_MODEL | Conversation components | Default model name used by the `conversation.huggingface` component when no `model` metadata is set in the component file. Default: `deepseek-ai/DeepSeek-R1-Distill-Qwen-32B`. |
+| OLLAMA_MODEL | Conversation components | Default model name used by the `conversation.ollama` component when no `model` metadata is set in the component file. Default: `llama3.2:latest`. |
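For illustration, the runtime fallback these variables describe can be pictured with a component that omits `model` entirely. This is a sketch, not an excerpt from the docs; the component name and key value are placeholders.

```yaml
# Sketch: conversation.openai component with no `model` metadata.
# Per the table above, the runtime would use OPENAI_MODEL if that
# environment variable is set, otherwise the built-in default gpt-5-nano.
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: openai
spec:
  type: conversation.openai
  version: v1
  metadata:
    - name: key
      value: "mykey"   # placeholder API key
```

Presumably, setting `OPENAI_MODEL` in the environment of the Dapr sidecar process would then override the built-in default without editing the component file.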