Commit bed9d6b

upgrade ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 to cpu-1.7 (#2098)

Signed-off-by: chensuyue <suyue.chen@intel.com>

1 parent: 23f03af
37 files changed: +72 −70 lines
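The change itself is mechanical: every occurrence of the `cpu-1.5` tag is replaced with `cpu-1.7`. A bump like this can be sketched as a one-line substitution; the snippet below demonstrates the rewrite on a single sample line rather than on real files (GNU `sed` assumed; the variable names are illustrative, not from the commit):

```shell
# Old and new TEI image references, as they appear throughout this commit.
old='ghcr.io/huggingface/text-embeddings-inference:cpu-1.5'
new='ghcr.io/huggingface/text-embeddings-inference:cpu-1.7'

# Demo of the substitution on one compose-style line; applying it to real
# files would use `sed -i` over the affected READMEs and compose files.
line="image: $old"
updated=$(printf '%s\n' "$line" | sed "s|$old|$new|")
echo "$updated"   # image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
```

A follow-up `grep -r cpu-1.5` over the repository is a cheap way to confirm no stale references remain.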

AgentQnA/docker_compose/amd/gpu/rocm/README.md

Lines changed: 3 additions & 3 deletions

@@ -86,7 +86,7 @@ We remind you that when using a specific version of the code, you need to use th
 
 ```bash
 docker pull redis/redis-stack:7.2.0-v9
-docker pull ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
+docker pull ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
 ```
 
 After the build, we check the list of images with the command:
@@ -102,7 +102,7 @@ We remind you that when using a specific version of the code, you need to use th
 - opea/vllm-rocm:latest
 - opea/agent:latest
 - redis/redis-stack:7.2.0-v9
-- ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
+- ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
 - opea/embedding:latest
 - opea/retriever:latest
 - opea/reranking:latest
@@ -113,7 +113,7 @@ We remind you that when using a specific version of the code, you need to use th
 - ghcr.io/huggingface/text-generation-inference:2.3.1-rocm
 - opea/agent:latest
 - redis/redis-stack:7.2.0-v9
-- ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
+- ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
 - opea/embedding:latest
 - opea/retriever:latest
 - opea/reranking:latest

ChatQnA/docker_compose/amd/gpu/rocm/README.md

Lines changed: 8 additions & 8 deletions

@@ -164,10 +164,10 @@ eaf24161aca8 opea/nginx:latest "/docker-
 613c384979f4 opea/chatqna:latest "bash entrypoint.sh" 37 seconds ago Up 5 seconds 0.0.0.0:18102->8888/tcp, [::]:18102->8888/tcp chatqna-backend-server
 05512bd29fee opea/dataprep:latest "sh -c 'python $( [ …" 37 seconds ago Up 36 seconds (healthy) 0.0.0.0:18103->5000/tcp, [::]:18103->5000/tcp chatqna-dataprep-service
 49844d339d1d opea/retriever:latest "python opea_retriev…" 37 seconds ago Up 36 seconds 0.0.0.0:7000->7000/tcp, [::]:7000->7000/tcp chatqna-retriever
-75b698fe7de0 ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 "text-embeddings-rou…" 37 seconds ago Up 36 seconds 0.0.0.0:18808->80/tcp, [::]:18808->80/tcp chatqna-tei-reranking-service
+75b698fe7de0 ghcr.io/huggingface/text-embeddings-inference:cpu-1.7 "text-embeddings-rou…" 37 seconds ago Up 36 seconds 0.0.0.0:18808->80/tcp, [::]:18808->80/tcp chatqna-tei-reranking-service
 342f01bfdbb2 ghcr.io/huggingface/text-generation-inference:2.3.1-rocm "python3 /workspace/…" 37 seconds ago Up 36 seconds 0.0.0.0:18008->8011/tcp, [::]:18008->8011/tcp chatqna-tgi-service
 6081eb1c119d redis/redis-stack:7.2.0-v9 "/entrypoint.sh" 37 seconds ago Up 36 seconds 0.0.0.0:6379->6379/tcp, [::]:6379->6379/tcp, 0.0.0.0:8001->8001/tcp, [::]:8001->8001/tcp chatqna-redis-vector-db
-eded17420782 ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 "text-embeddings-rou…" 37 seconds ago Up 36 seconds 0.0.0.0:18090->80/tcp, [::]:18090->80/tcp chatqna-tei-embedding-service
+eded17420782 ghcr.io/huggingface/text-embeddings-inference:cpu-1.7 "text-embeddings-rou…" 37 seconds ago Up 36 seconds 0.0.0.0:18090->80/tcp, [::]:18090->80/tcp chatqna-tei-embedding-service
 ```
 
 if used TGI with FaqGen:
@@ -180,10 +180,10 @@ eaf24161aca8 opea/nginx:latest "/docker-
 e0ef1ea67640 opea/llm-faqgen:latest "bash entrypoint.sh" 37 seconds ago Up 36 seconds 0.0.0.0:18011->9000/tcp, [::]:18011->9000/tcp chatqna-llm-faqgen
 05512bd29fee opea/dataprep:latest "sh -c 'python $( [ …" 37 seconds ago Up 36 seconds (healthy) 0.0.0.0:18103->5000/tcp, [::]:18103->5000/tcp chatqna-dataprep-service
 49844d339d1d opea/retriever:latest "python opea_retriev…" 37 seconds ago Up 36 seconds 0.0.0.0:7000->7000/tcp, [::]:7000->7000/tcp chatqna-retriever
-75b698fe7de0 ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 "text-embeddings-rou…" 37 seconds ago Up 36 seconds 0.0.0.0:18808->80/tcp, [::]:18808->80/tcp chatqna-tei-reranking-service
+75b698fe7de0 ghcr.io/huggingface/text-embeddings-inference:cpu-1.7 "text-embeddings-rou…" 37 seconds ago Up 36 seconds 0.0.0.0:18808->80/tcp, [::]:18808->80/tcp chatqna-tei-reranking-service
 342f01bfdbb2 ghcr.io/huggingface/text-generation-inference:2.3.1-rocm "python3 /workspace/…" 37 seconds ago Up 36 seconds 0.0.0.0:18008->8011/tcp, [::]:18008->8011/tcp chatqna-tgi-service
 6081eb1c119d redis/redis-stack:7.2.0-v9 "/entrypoint.sh" 37 seconds ago Up 36 seconds 0.0.0.0:6379->6379/tcp, [::]:6379->6379/tcp, 0.0.0.0:8001->8001/tcp, [::]:8001->8001/tcp chatqna-redis-vector-db
-eded17420782 ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 "text-embeddings-rou…" 37 seconds ago Up 36 seconds 0.0.0.0:18090->80/tcp, [::]:18090->80/tcp chatqna-tei-embedding-service
+eded17420782 ghcr.io/huggingface/text-embeddings-inference:cpu-1.7 "text-embeddings-rou…" 37 seconds ago Up 36 seconds 0.0.0.0:18090->80/tcp, [::]:18090->80/tcp chatqna-tei-embedding-service
 ```
 
 if used vLLM:
@@ -195,10 +195,10 @@ eaf24161aca8 opea/nginx:latest "/docker-
 613c384979f4 opea/chatqna:latest "bash entrypoint.sh" 37 seconds ago Up 5 seconds 0.0.0.0:18102->8888/tcp, [::]:18102->8888/tcp chatqna-backend-server
 05512bd29fee opea/dataprep:latest "sh -c 'python $( [ …" 37 seconds ago Up 36 seconds (healthy) 0.0.0.0:18103->5000/tcp, [::]:18103->5000/tcp chatqna-dataprep-service
 49844d339d1d opea/retriever:latest "python opea_retriev…" 37 seconds ago Up 36 seconds 0.0.0.0:7000->7000/tcp, [::]:7000->7000/tcp chatqna-retriever
-75b698fe7de0 ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 "text-embeddings-rou…" 37 seconds ago Up 36 seconds 0.0.0.0:18808->80/tcp, [::]:18808->80/tcp chatqna-tei-reranking-service
+75b698fe7de0 ghcr.io/huggingface/text-embeddings-inference:cpu-1.7 "text-embeddings-rou…" 37 seconds ago Up 36 seconds 0.0.0.0:18808->80/tcp, [::]:18808->80/tcp chatqna-tei-reranking-service
 342f01bfdbb2 opea/vllm-rocm:latest "python3 /workspace/…" 37 seconds ago Up 36 seconds 0.0.0.0:18008->8011/tcp, [::]:18008->8011/tcp chatqna-vllm-service
 6081eb1c119d redis/redis-stack:7.2.0-v9 "/entrypoint.sh" 37 seconds ago Up 36 seconds 0.0.0.0:6379->6379/tcp, [::]:6379->6379/tcp, 0.0.0.0:8001->8001/tcp, [::]:8001->8001/tcp chatqna-redis-vector-db
-eded17420782 ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 "text-embeddings-rou…" 37 seconds ago Up 36 seconds 0.0.0.0:18090->80/tcp, [::]:18090->80/tcp chatqna-tei-embedding-service
+eded17420782 ghcr.io/huggingface/text-embeddings-inference:cpu-1.7 "text-embeddings-rou…" 37 seconds ago Up 36 seconds 0.0.0.0:18090->80/tcp, [::]:18090->80/tcp chatqna-tei-embedding-service
 ```
 
 if used vLLM with FaqGen:
@@ -211,10 +211,10 @@ eaf24161aca8 opea/nginx:latest "/docker-
 e0ef1ea67640 opea/llm-faqgen:latest "bash entrypoint.sh" 37 seconds ago Up 36 seconds 0.0.0.0:18011->9000/tcp, [::]:18011->9000/tcp chatqna-llm-faqgen
 05512bd29fee opea/dataprep:latest "sh -c 'python $( [ …" 37 seconds ago Up 36 seconds (healthy) 0.0.0.0:18103->5000/tcp, [::]:18103->5000/tcp chatqna-dataprep-service
 49844d339d1d opea/retriever:latest "python opea_retriev…" 37 seconds ago Up 36 seconds 0.0.0.0:7000->7000/tcp, [::]:7000->7000/tcp chatqna-retriever
-75b698fe7de0 ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 "text-embeddings-rou…" 37 seconds ago Up 36 seconds 0.0.0.0:18808->80/tcp, [::]:18808->80/tcp chatqna-tei-reranking-service
+75b698fe7de0 ghcr.io/huggingface/text-embeddings-inference:cpu-1.7 "text-embeddings-rou…" 37 seconds ago Up 36 seconds 0.0.0.0:18808->80/tcp, [::]:18808->80/tcp chatqna-tei-reranking-service
 342f01bfdbb2 opea/vllm-rocm:latest "python3 /workspace/…" 37 seconds ago Up 36 seconds 0.0.0.0:18008->8011/tcp, [::]:18008->8011/tcp chatqna-vllm-service
 6081eb1c119d redis/redis-stack:7.2.0-v9 "/entrypoint.sh" 37 seconds ago Up 36 seconds 0.0.0.0:6379->6379/tcp, [::]:6379->6379/tcp, 0.0.0.0:8001->8001/tcp, [::]:8001->8001/tcp chatqna-redis-vector-db
-eded17420782 ghcr.io/huggingface/text-embeddings-inference:cpu-1.5 "text-embeddings-rou…" 37 seconds ago Up 36 seconds 0.0.0.0:18090->80/tcp, [::]:18090->80/tcp chatqna-tei-embedding-service
+eded17420782 ghcr.io/huggingface/text-embeddings-inference:cpu-1.7 "text-embeddings-rou…" 37 seconds ago Up 36 seconds 0.0.0.0:18090->80/tcp, [::]:18090->80/tcp chatqna-tei-embedding-service
 ```
 
 If any issues are encountered during deployment, refer to the [Troubleshooting](../../../../README_miscellaneous.md#troubleshooting) section.
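The `docker ps` listings above are the reference for verifying that the running containers actually use the new tag. A minimal sketch of that check, operating on one sample line rather than a live Docker daemon (the container ID and truncated command mirror the listing; `awk` field extraction assumed to be available):

```shell
# The image is the second whitespace-separated field of a `docker ps` line.
# In practice this would be: docker ps --format '{{.Image}}' | grep tei
line='eded17420782 ghcr.io/huggingface/text-embeddings-inference:cpu-1.7 "text-embeddings-rou" 37 seconds ago'
image=$(printf '%s\n' "$line" | awk '{print $2}')
echo "$image"   # ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
```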

ChatQnA/docker_compose/amd/gpu/rocm/compose_faqgen_vllm.yaml

Lines changed: 2 additions & 2 deletions

@@ -33,7 +33,7 @@ services:
     restart: unless-stopped
 
   chatqna-tei-embedding-service:
-    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
+    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
     container_name: chatqna-tei-embedding-service
     ports:
       - "${CHATQNA_TEI_EMBEDDING_PORT}:80"
@@ -68,7 +68,7 @@ services:
     restart: unless-stopped
 
   chatqna-tei-reranking-service:
-    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
+    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
     container_name: chatqna-tei-reranking-service
     ports:
       - "${CHATQNA_TEI_RERANKING_PORT}:80"

ChatQnA/docker_compose/amd/gpu/rocm/compose_vllm.yaml

Lines changed: 2 additions & 2 deletions

@@ -33,7 +33,7 @@ services:
     restart: unless-stopped
 
   chatqna-tei-embedding-service:
-    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
+    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
     container_name: chatqna-tei-embedding-service
     ports:
       - "${CHATQNA_TEI_EMBEDDING_PORT}:80"
@@ -66,7 +66,7 @@ services:
     restart: unless-stopped
 
   chatqna-tei-reranking-service:
-    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
+    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
     container_name: chatqna-tei-reranking-service
     ports:
       - "${CHATQNA_TEI_RERANKING_PORT}:80"
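After the change, the embedding service block in these ROCm compose files reads roughly as follows. This is a reconstruction from the diff context only: indentation follows standard Compose conventions, and fields outside the hunks (environment, healthcheck, etc.) are omitted:

```yaml
services:
  chatqna-tei-embedding-service:
    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
    container_name: chatqna-tei-embedding-service
    ports:
      - "${CHATQNA_TEI_EMBEDDING_PORT}:80"
```

The reranking service is the same image with its own container name and `${CHATQNA_TEI_RERANKING_PORT}` mapping, which is why every file in this commit changes the tag in exactly two places.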

ChatQnA/docker_compose/intel/cpu/aipc/compose.yaml

Lines changed: 2 additions & 2 deletions

@@ -32,7 +32,7 @@ services:
       retries: 50
     restart: unless-stopped
   tei-embedding-service:
-    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
+    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
     container_name: tei-embedding-server
     ports:
       - "6006:80"
@@ -65,7 +65,7 @@ services:
       RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS"
     restart: unless-stopped
   tei-reranking-service:
-    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
+    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
     container_name: tei-reranking-server
     ports:
       - "8808:80"

ChatQnA/docker_compose/intel/cpu/xeon/compose.yaml

Lines changed: 2 additions & 2 deletions

@@ -39,7 +39,7 @@ services:
       retries: 50
     restart: unless-stopped
   tei-embedding-service:
-    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
+    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
     container_name: tei-embedding-server
     ports:
       - "6006:80"
@@ -72,7 +72,7 @@ services:
       RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS"
     restart: unless-stopped
   tei-reranking-service:
-    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
+    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
     container_name: tei-reranking-server
     ports:
       - "8808:80"

ChatQnA/docker_compose/intel/cpu/xeon/compose_faqgen.yaml

Lines changed: 2 additions & 2 deletions

@@ -32,7 +32,7 @@ services:
       retries: 50
     restart: unless-stopped
   tei-embedding-service:
-    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
+    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
     container_name: tei-embedding-server
     ports:
       - "6006:80"
@@ -65,7 +65,7 @@ services:
       RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS"
     restart: unless-stopped
   tei-reranking-service:
-    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
+    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
     container_name: tei-reranking-server
     ports:
       - "8808:80"

ChatQnA/docker_compose/intel/cpu/xeon/compose_faqgen_tgi.yaml

Lines changed: 2 additions & 2 deletions

@@ -32,7 +32,7 @@ services:
       retries: 50
     restart: unless-stopped
   tei-embedding-service:
-    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
+    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
     container_name: tei-embedding-server
     ports:
       - "6006:80"
@@ -65,7 +65,7 @@ services:
       RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_REDIS"
     restart: unless-stopped
   tei-reranking-service:
-    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
+    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
     container_name: tei-reranking-server
     ports:
       - "8808:80"

ChatQnA/docker_compose/intel/cpu/xeon/compose_mariadb.yaml

Lines changed: 2 additions & 2 deletions

@@ -43,7 +43,7 @@ services:
       retries: 50
     restart: unless-stopped
   tei-embedding-service:
-    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
+    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
     container_name: tei-embedding-server
     ports:
       - "6006:80"
@@ -74,7 +74,7 @@ services:
       RETRIEVER_COMPONENT_NAME: "OPEA_RETRIEVER_MARIADBVECTOR"
     restart: unless-stopped
   tei-reranking-service:
-    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
+    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
     container_name: tei-reranking-server
     ports:
       - "8808:80"

ChatQnA/docker_compose/intel/cpu/xeon/compose_milvus.yaml

Lines changed: 2 additions & 2 deletions

@@ -113,7 +113,7 @@ services:
     restart: unless-stopped
 
   tei-embedding-service:
-    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
+    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
     container_name: tei-embedding-server
     ports:
       - "6006:80"
@@ -127,7 +127,7 @@ services:
     command: --model-id ${EMBEDDING_MODEL_ID} --auto-truncate
 
   tei-reranking-service:
-    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
+    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.7
     container_name: tei-reranking-server
     ports:
       - "8808:80"
