Commit f7e9c7d

add note on HW requirement
Signed-off-by: alexsin368 <alex.sin@intel.com>
1 parent 1f23e47

File tree: 1 file changed (+4 −2 lines)
AgentQnA/README.md

Lines changed: 4 additions & 2 deletions
````diff
@@ -210,9 +210,11 @@ The command below will launch the multi-agent system with the `DocIndexRetriever
 docker compose -f $WORKDIR/GenAIExamples/DocIndexRetriever/docker_compose/intel/cpu/xeon/compose.yaml -f compose_openai.yaml up -d
 ```

-##### Models on Remote Server
+##### Models on Remote Servers

-When models are deployed on a remote server with Intel® AI for Enterprise Inference, a base URL and an API key are required to access them. To run the Agent microservice on Xeon while using models deployed on a remote server, add `compose_remote.yaml` to the `docker compose` command and set additional environment variables.
+When models are deployed on a remote server with Intel® AI for Enterprise Inference, a base URL and an API key are required to access them. To run the agent microservice on Xeon while using models deployed on a remote server, add `compose_remote.yaml` to the `docker compose` command and set additional environment variables.
+
+> **Note**: For AgentQnA, the minimum hardware requirement for the remote server is Intel® Gaudi® AI Accelerators.

 Set the following environment variables.
````
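For readers following the changed step, below is a minimal sketch of what the remote-server launch could look like. The compose directory, the base compose file name, and the environment variable names (`REMOTE_ENDPOINT`, `OPENAI_API_KEY`) are illustrative placeholders, not taken from this diff; the actual values are documented in the README step that follows ("Set the following environment variables").

```bash
# Illustrative sketch only: variable names and paths are assumptions, not the
# canonical ones from the AgentQnA README.
export REMOTE_ENDPOINT=https://your-inference-endpoint.example.com   # base URL of the remote inference server
export OPENAI_API_KEY=your-api-key                                    # API key issued for that endpoint

# Layer the remote-model overrides (compose_remote.yaml) on top of the base
# compose file when launching the agent microservice on Xeon.
cd $WORKDIR/GenAIExamples/AgentQnA/docker_compose/intel/cpu/xeon
docker compose -f compose_openai.yaml -f compose_remote.yaml up -d
```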
