Add onnx infer tipc #439

Merged: 3 commits, Jun 29, 2022
5 changes: 4 additions & 1 deletion test_tipc/README.md
@@ -50,6 +50,7 @@ test_tipc/
│ │ └── train_ptq_infer_python.txt # config for the PP-TSM post-training-quantization inference test on Linux
│ ├── PP-TSN/
│ │ ├── train_infer_python.txt # config for PP-TSN Python training and inference (basic) on Linux
│ │ ├── paddle2onnx_infer_python.txt # config for PP-TSN Paddle2ONNX inference on Linux
│ │ ├── serving_infer_cpp.txt # config for the PP-TSN C++ serving test on Linux
│ │ └── train_amp_infer_python.txt # config for PP-TSN Python training and inference (mixed precision) on Linux
│ ├── ...
@@ -70,6 +71,7 @@ test_tipc/
├── docs/ # detailed docs for each TIPC feature
├── test_train_inference_python.sh # main script for testing Python training and inference
├── test_inference_cpp.sh # main script for testing C++ inference
├── test_paddle2onnx.sh # main script for testing paddle2onnx conversion and inference
├── compare_results.py # checks that the accuracy gap between predictions in the logs and the pre-stored results stays within tolerance
└── README.md # introduction
```
@@ -124,7 +126,8 @@ bash test_tipc/test_train_inference_python.sh ./test_tipc/configs/PP-TSM/train_i
- [test_train_inference_python usage](docs/test_train_inference_python.md): tests basic Python-based model training, evaluation, and inference.
- [test_amp_train_inference_python usage](docs/test_train_amp_inference_python.md): tests **mixed-precision** Python-based model training, evaluation, and inference.
- [test_inference_cpp usage](docs/test_inference_cpp.md): tests C++-based model inference.
- [test_paddle2onnx usage](docs/test_paddle2onnx.md): tests inference with models converted via Paddle2ONNX.
- [test_serving_infer_python usage](docs/test_serving_infer_python.md): tests Python-based Paddle Serving deployment.
- [test_serving_infer_cpp usage](docs/test_serving_infer_cpp.md): tests C++-based Paddle Serving deployment.
- [test_ptq_inference_python usage](docs/test_train_ptq_inference_python.md): tests post-training quantization and inference.
- [test_train_fleet_inference_python usage](./docs/test_train_fleet_inference_python.md): tests multi-node, multi-GPU training and inference in Python.
14 changes: 14 additions & 0 deletions test_tipc/configs/PP-TSN/paddle2onnx_infer_python.txt
@@ -0,0 +1,14 @@
===========================paddle2onnx_params===========================
model_name:PP-TSN
python:python3.7
2onnx: paddle2onnx
--model_dir:./inference/ppTSN/
--model_filename:ppTSN.pdmodel
--params_filename:ppTSN.pdiparams
--save_file:./inference/ppTSN/ppTSN.onnx
--opset_version:10
--enable_onnx_checker:True
inference:./deploy/paddle2onnx/predict_onnx.py
--config:./configs/recognition/pptsn/pptsn_k400_videos.yaml
--onnx_file:./inference/ppTSN/ppTSN.onnx
--input_file:./data/example.avi
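Each line of this config file is a `key:value` pair that `test_paddle2onnx.sh` later splits with the `func_parser_key`/`func_parser_value` helpers from `common_func.sh` (not shown in this diff). A minimal sketch of that split, assuming the first `:` separates key from value:

```shell
# Hypothetical re-implementation of the key/value split that the
# func_parser_key / func_parser_value helpers presumably perform.
line="--opset_version:10"      # sample line from the config above
key="${line%%:*}"              # everything before the first ':'
value="${line#*:}"             # everything after the first ':'
echo "${key}=${value}"         # prints: --opset_version=10
```

Parsing on the first `:` only is what lets values such as `./inference/ppTSN/` contain no further special handling, since paths here carry no additional colons.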
84 changes: 84 additions & 0 deletions test_tipc/docs/test_paddle2onnx.md
@@ -0,0 +1,84 @@
# Paddle2ONNX Test

The main entry point for Paddle2ONNX testing is `test_paddle2onnx.sh`, which tests Paddle2ONNX model conversion and ONNX-based inference.


## 1. Test Summary

- Inference:

| Algorithm | Model | device_CPU | device_GPU | batchsize |
| :----: | :----: | :----: | :----: | :----: |
| PP-TSN | pptsn_k400_videos | Supported | Supported | 1 |


## 2. Test Procedure

### 2.1 Prepare the data

The data used for basic training/inference testing is `data/example.avi`, which is already in the repository; no download is required.


### 2.2 Prepare the environment

- Install PaddlePaddle: if you already have paddlepaddle 2.2 or later installed, there is no need to run the commands below.
```
# Paddle 2.2 or later is required
# install the GPU build of Paddle
python3.7 -m pip install paddlepaddle-gpu==2.2.0
# install the CPU build of Paddle
python3.7 -m pip install paddlepaddle==2.2.0
```

- Install the dependencies
```
python3.7 -m pip install -r requirements.txt
```

- Install Paddle2ONNX
```
python3.7 -m pip install paddle2onnx
```

- Install ONNXRuntime
```
# version 1.9.0 is recommended; change the version number to match your environment
python3.7 -m pip install onnxruntime==1.9.0
```


### 2.3 Functional test

The test is run as shown below; to test a different model, simply substitute its parameter configuration file.

```bash
bash test_tipc/test_paddle2onnx.sh ${your_params_file}
```

Taking the `Paddle2ONNX test` of `PP-TSN` as an example, the commands are as follows.

```bash
bash test_tipc/prepare.sh test_tipc/configs/PP-TSN/paddle2onnx_infer_python.txt paddle2onnx_infer

bash test_tipc/test_paddle2onnx.sh test_tipc/configs/PP-TSN/paddle2onnx_infer_python.txt
```

The following output indicates that the commands ran successfully.

```
Run successfully with command - paddle2onnx --model_dir=./inference/ppTSN/ --model_filename=ppTSN.pdmodel --params_filename=ppTSN.pdiparams --save_file=./inference/ppTSN/ppTSN.onnx --opset_version=10 --enable_onnx_checker=True!
Run successfully with command - python3.7 ./deploy/paddle2onnx/predict_onnx.py --config=./configs/recognition/pptsn/pptsn_k400_videos.yaml --input_file=./data/example.avi --onnx_file=./inference/ppTSN/ppTSN.onnx > ./log/PP-TSN//paddle2onnx_infer_cpu.log 2>&1 !
```
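The `Run successfully with command - ...` lines above come from the `status_check` helper in `common_func.sh`, which is not part of this diff; a minimal sketch, under the assumption that it records each command's exit status in the results log:

```shell
# Run a command, capture its exit status, and append a success/failure
# line to the results log (line format taken from the output above).
status_log="./results_paddle2onnx.log"
cmd="true"   # stand-in for the paddle2onnx / inference command
eval "$cmd"
last_status=$?
if [ "$last_status" -eq 0 ]; then
    echo "Run successfully with command - ${cmd}!" >> "$status_log"
else
    echo "Run failed with command - ${cmd}!" >> "$status_log"
fi
```

Because failures are logged rather than aborting the script, one bad conversion does not stop the remaining inference checks from running.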

The prediction results are automatically saved to `./log/PP-TSN/paddle2onnx_infer_cpu.log`, where the ONNX run output can be seen:
```
W0524 15:28:29.723601 75410 gpu_resources.cc:61] Please NOTE: device: 0, GPU Compute Capability: 7.0, Driver API Version: 10.2, Runtime API Version: 10.2
W0524 15:28:29.982623 75410 gpu_resources.cc:91] device: 0, cuDNN Version: 7.6.
Inference model(ppTSN)...
Current video file: ./data/example.avi
top-1 class: 5
top-1 score: 0.9998553991317749
```

If a run fails, the failure log and the corresponding command are also printed to the terminal; that command can be used to diagnose the cause of the failure.
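To consume the log programmatically, the `top-1` lines can be pulled out with standard tools; a sketch over a copy of the sample output above (the values are from this document, not freshly generated):

```shell
# Extract the top-1 prediction lines from an inference log; the sample
# here mirrors the log excerpt shown above.
log_excerpt="Inference model(ppTSN)...
Current video file: ./data/example.avi
top-1 class: 5
top-1 score: 0.9998553991317749"
printf '%s\n' "$log_excerpt" | grep "top-1"
```

This prints the `top-1 class` and `top-1 score` lines, which is also the pattern `compare_results.py`-style tooling could match against pre-stored results.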
21 changes: 20 additions & 1 deletion test_tipc/prepare.sh
@@ -514,5 +514,24 @@ if [ ${MODE} = "serving_infer_python" ];then
fi

if [ ${MODE} = "paddle2onnx_infer" ];then
echo "Not added into TIPC now."
# install paddle2onnx
python_name_list=$(func_parser_value "${lines[2]}")
IFS='|'
array=(${python_name_list})
python_name=${array[0]}
${python_name} -m pip install paddle2onnx
${python_name} -m pip install onnxruntime==1.9.0

if [ ${model_name} = "PP-TSM" ]; then
echo "Not added into TIPC now."
elif [ ${model_name} = "PP-TSN" ]; then
mkdir -p ./inference
wget -P ./inference/ https://videotag.bj.bcebos.com/PaddleVideo-release2.3/ppTSN.zip
# unzip inference model
pushd ./inference
unzip ppTSN.zip
popd
else
echo "Not added into TIPC now."
fi
fi
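The `python_name` extraction above takes the first interpreter from the (possibly `|`-separated) `python:` line of the config. A standalone sketch of that split, using a hypothetical two-interpreter value:

```shell
# Split a '|'-separated interpreter list and keep the first entry, as
# prepare.sh does; the two-interpreter value here is hypothetical.
python_name_list="python3.7|python3.8"
IFS='|'
array=(${python_name_list})   # word-splitting on '|' fills the array
python_name=${array[0]}
unset IFS
echo "$python_name"           # prints: python3.7
```

Relying on `IFS` word-splitting keeps the parser dependency-free, at the cost of requiring the value to contain no whitespace.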
80 changes: 80 additions & 0 deletions test_tipc/test_paddle2onnx.sh
@@ -0,0 +1,80 @@
#!/bin/bash
source test_tipc/common_func.sh

FILENAME=$1

dataline=$(cat ${FILENAME})
lines=(${dataline})
# common params
model_name=$(func_parser_value "${lines[1]}")
python=$(func_parser_value "${lines[2]}")


# parser params
dataline=$(awk 'NR==1, NR==14{print}' $FILENAME)
IFS=$'\n'
lines=(${dataline})

# parser paddle2onnx
model_name=$(func_parser_value "${lines[1]}")
python=$(func_parser_value "${lines[2]}")
padlle2onnx_cmd=$(func_parser_value "${lines[3]}")
infer_model_dir_key=$(func_parser_key "${lines[4]}")
infer_model_dir_value=$(func_parser_value "${lines[4]}")
model_filename_key=$(func_parser_key "${lines[5]}")
model_filename_value=$(func_parser_value "${lines[5]}")
params_filename_key=$(func_parser_key "${lines[6]}")
params_filename_value=$(func_parser_value "${lines[6]}")
save_file_key=$(func_parser_key "${lines[7]}")
save_file_value=$(func_parser_value "${lines[7]}")
opset_version_key=$(func_parser_key "${lines[8]}")
opset_version_value=$(func_parser_value "${lines[8]}")
enable_onnx_checker_key=$(func_parser_key "${lines[9]}")
enable_onnx_checker_value=$(func_parser_value "${lines[9]}")
# parser onnx inference
inference_py=$(func_parser_value "${lines[10]}")
config_key=$(func_parser_key "${lines[11]}")
config_value=$(func_parser_value "${lines[11]}")
model_key=$(func_parser_key "${lines[12]}")
input_file_key=$(func_parser_key "${lines[13]}")
input_file_value=$(func_parser_value "${lines[13]}")


LOG_PATH="./log/${model_name}/${MODE}"
mkdir -p ${LOG_PATH}
status_log="${LOG_PATH}/results_paddle2onnx.log"


function func_paddle2onnx(){
IFS='|'
_script=$1

# paddle2onnx
_save_log_path="${LOG_PATH}/paddle2onnx_infer_cpu.log"
set_dirname=$(func_set_params "${infer_model_dir_key}" "${infer_model_dir_value}")
set_model_filename=$(func_set_params "${model_filename_key}" "${model_filename_value}")
set_params_filename=$(func_set_params "${params_filename_key}" "${params_filename_value}")
set_save_model=$(func_set_params "${save_file_key}" "${save_file_value}")
set_opset_version=$(func_set_params "${opset_version_key}" "${opset_version_value}")
set_enable_onnx_checker=$(func_set_params "${enable_onnx_checker_key}" "${enable_onnx_checker_value}")
trans_model_cmd="${padlle2onnx_cmd} ${set_dirname} ${set_model_filename} ${set_params_filename} ${set_save_model} ${set_opset_version} ${set_enable_onnx_checker}"
eval $trans_model_cmd
last_status=${PIPESTATUS[0]}
status_check $last_status "${trans_model_cmd}" "${status_log}" "${model_name}"
# python inference
set_model_dir=$(func_set_params "${model_key}" "${save_file_value}")
set_input_file=$(func_set_params "${input_file_key}" "${input_file_value}")
set_config=$(func_set_params "${config_key}" "${config_value}")
infer_model_cmd="${python} ${inference_py} ${set_config} ${set_input_file} ${set_model_dir} > ${_save_log_path} 2>&1 "
eval $infer_model_cmd
last_status=${PIPESTATUS[0]}
status_check $last_status "${infer_model_cmd}" "${status_log}" "${model_name}"
}


echo "################### run test ###################"

export Count=0
IFS="|"
func_paddle2onnx