
Commit 897b861

template yaml file

1 parent b3b57e8 commit 897b861

2 files changed: 8 additions & 6 deletions

README.md

Lines changed: 5 additions & 3 deletions
````diff
@@ -68,12 +68,14 @@ English tutorials(comming soon...)
   To test or visit the website, find out the kubernetes ingress IP
   addresses, or the NodePort.

-  Then open your browser and visit http://<ingress-ip-address>, or
-  http://<any-node-ip-address>:<NodePort>
+  Then open your browser and visit `http://<ingress-ip-address>`, or
+  `http://<any-node-ip-address>:<NodePort>`

 - Prepare public dataset

-  You can create a Kubernetes Job for preparing the public dataset and cluster trainer files.
+  You can create a Kubernetes Job for preparing the public cloud dataset with RecordIO files. You should modify the YAML file as your environment:
+  - `<DATACENTER>`, Your cluster datacenter
+  - `<MONITOR_ADDR>`, Ceph monitor address
   ```bash
   kubectl create -f k8s/prepare_dataset.yaml
   ```
````
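As a quick sketch of the workflow this README change describes — assuming a hypothetical datacenter name `dc1` and a Ceph monitor at `10.0.0.1:6789`, neither of which comes from this repository — the placeholders could be filled in and the Job submitted like this:

```bash
# Hypothetical values: datacenter "dc1", Ceph monitor at 10.0.0.1:6789.
# Substitute the placeholders in the template, then submit the Job.
sed -i 's/<DATACENTER>/dc1/g; s/<MONITOR_ADDR>/10.0.0.1:6789/g' k8s/prepare_dataset.yaml

kubectl create -f k8s/prepare_dataset.yaml

# Watch the Job until the dataset conversion finishes.
kubectl get jobs -w
```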

k8s/prepare_dataset.yaml

Lines changed: 3 additions & 3 deletions
```diff
@@ -11,16 +11,16 @@ spec:
       - name: data-storage
         cephfs:
           monitors:
-          - 192.168.16.23:6789
+          - <MONITOR_ADDR>
           path: "/public"
           user: "admin"
           secretRef:
             name: ceph-secret
       containers:
       - name: prepare
         image: yancey1989/paddlecloud-job
-        command: ["sh", "-c", "python -c \"import paddle.v2.dataset as dataset; dataset.common.convert('/pfs/dlnel/public/dataset')\""]
+        command: ["sh", "-c", "python -c \"import paddle.v2.dataset as dataset; dataset.common.convert('/pfs/<DATACENTER>/public/dataset')\""]
         volumeMounts:
         - name: data-storage
-          mountPath: /pfs/dlnel/public
+          mountPath: /pfs/<DATACENTER>/public
       restartPolicy: Never
```
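For reference, a minimal sketch of what the volume and container sections of `k8s/prepare_dataset.yaml` might look like once the placeholders are filled in, using the same hypothetical values (`dc1`, `10.0.0.1:6789`):

```yaml
# Sketch only; "dc1" and 10.0.0.1:6789 are hypothetical example values.
volumes:
- name: data-storage
  cephfs:
    monitors:
    - 10.0.0.1:6789             # was <MONITOR_ADDR>
    path: "/public"
    user: "admin"
    secretRef:
      name: ceph-secret
containers:
- name: prepare
  image: yancey1989/paddlecloud-job
  command: ["sh", "-c", "python -c \"import paddle.v2.dataset as dataset; dataset.common.convert('/pfs/dc1/public/dataset')\""]
  volumeMounts:
  - name: data-storage
    mountPath: /pfs/dc1/public  # was /pfs/<DATACENTER>/public
restartPolicy: Never
```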
