
Commit 023d02d

[cherry-pick] update paddle2onnx doc (#14051)
1 parent ee1aa57 commit 023d02d

File tree: 2 files changed, +12 -0 lines changed

docs/ppocr/infer_deploy/paddle2onnx.en.md

Lines changed: 6 additions & 0 deletions
@@ -92,6 +92,12 @@ After execution, the ONNX model will be saved in `./inference/det_onnx/`, `./inf
 
 In addition, the following models do not currently support conversion to ONNX models: NRTR, SAR, RARE, SRN.
 
+ If you need to optimize the exported ONNX model, we recommend using `onnxslim`:
+ ```bash linenums="1"
+ pip install onnxslim
+ onnxslim model.onnx slim.onnx
+ ```
+
 ## 3. Prediction
 
 Taking the English OCR model as an example, run prediction with **ONNXRuntime** using the following commands:
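The prediction commands themselves sit outside this hunk. As a minimal sketch only (the `tools/infer/predict_system.py` entry point, the `--use_onnx` flag, and the model, dictionary, and image paths are assumptions based on the export locations mentioned above, not part of this diff), the invocation looks roughly like:

```bash linenums="1"
# Sketch, not part of this commit: run the detection + classification +
# recognition pipeline on the exported ONNX models via ONNXRuntime.
python3 tools/infer/predict_system.py --use_gpu=False --use_onnx=True \
    --det_model_dir=./inference/det_onnx/model.onnx \
    --rec_model_dir=./inference/rec_onnx/model.onnx \
    --cls_model_dir=./inference/cls_onnx/model.onnx \
    --rec_char_dict_path=ppocr/utils/en_dict.txt \
    --image_dir=./doc/imgs_en/img_12.jpg
```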

docs/ppocr/infer_deploy/paddle2onnx.md

Lines changed: 6 additions & 0 deletions
@@ -97,6 +97,12 @@ paddle2onnx --model_dir ./inference/ch_ppocr_mobile_v2.0_cls_infer \
 --input_shape_dict "{'x': [-1,3,-1,-1]}"
 ```
 
+ If you need to optimize the exported ONNX model, we recommend using `onnxslim`:
+ ```bash linenums="1"
+ pip install onnxslim
+ onnxslim model.onnx slim.onnx
+ ```
+
 ## 3. Inference and Prediction
 
 Taking the Chinese OCR model as an example, run prediction with ONNXRuntime using the following command:
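The concrete command again falls outside the hunk. One natural follow-on to this commit's addition, sketched here under the assumption that the file produced by `onnxslim` is a drop-in replacement and that `tools/infer/predict_det.py` with `--use_onnx` is the detection entry point (neither is stated in this diff), is to point the predictor at the slimmed model:

```bash linenums="1"
# Sketch, not part of this commit: slim the exported detection model,
# then run ONNXRuntime-based detection on the slimmed file.
onnxslim ./inference/det_onnx/model.onnx ./inference/det_onnx/model_slim.onnx
python3 tools/infer/predict_det.py --use_gpu=False --use_onnx=True \
    --det_model_dir=./inference/det_onnx/model_slim.onnx \
    --image_dir=./doc/imgs/11.jpg
```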
