
[Model] Add Solov2 For PaddleDetection #1435

Merged (28 commits merged into PaddlePaddle:develop on Mar 8, 2023)

Conversation

Zheng-Bicheng
Collaborator

PR types

Model

Description

Add SOLOv2.

@Zheng-Bicheng
Collaborator Author

Only testing CI for now; not finished yet.

@Zheng-Bicheng
Collaborator Author

@DefTruth Could you please take a look?

@Zheng-Bicheng
Collaborator Author

@DefTruth The visualization result is as follows:

(image: vis_result)

@Zheng-Bicheng
Collaborator Author

@jiangjiajun The requested changes have been made; only the latest commit needs to be reviewed.

fastdeploy/runtime/backends/paddle/util.cc (outdated; resolved)
const ModelFormat& model_format = ModelFormat::PADDLE)
: PPDetBase(model_file, params_file, config_file, custom_option,
model_format) {
valid_cpu_backends = { Backend::PDINFER};
Collaborator

Only the Paddle Inference backend is tested here; consider adding tests for the ORT/OpenVINO backends as well.

Collaborator Author

@jiangjiajun @DefTruth As far as I can tell, ORT/OV both use ONNX as the model format for inference? But converting SOLOv2 to ONNX currently fails because some operators are not supported, so this may have to wait for a fix on the Paddle2ONNX side.

Collaborator

Run OpenVINO directly (no ONNX conversion needed); if it works, it should be added to the CPU backends.
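
For illustration, a rough sketch of what that suggestion could look like in the SOLOv2 constructor quoted above; the parameter list and the Initialize() call are assumed from the usual PPDetBase-derived constructors, and whether OpenVINO can actually run this model is still being verified later in this thread.

SOLOv2(const std::string& model_file, const std::string& params_file,
       const std::string& config_file,
       const RuntimeOption& custom_option = RuntimeOption(),
       const ModelFormat& model_format = ModelFormat::PADDLE)
    : PPDetBase(model_file, params_file, config_file, custom_option,
                model_format) {
  // Keep Paddle Inference and add OpenVINO, which reads the Paddle model
  // directly; ORT would additionally need a working ONNX export.
  valid_cpu_backends = {Backend::OPENVINO, Backend::PDINFER};
  initialized = Initialize();
}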

fastdeploy/vision/detection/ppdet/preprocessor.cc (outdated; resolved)
@Zheng-Bicheng
Collaborator Author

Modified as requested.

:return: a new SOLOv2 object
"""

class SOLOv2Clone(YOLOv3):
Collaborator

This is still YOLOv3; it needs to be changed to SOLOv2.

@Zheng-Bicheng
Collaborator Author

@jiangjiajun Please take a look; I will fix any remaining issues.

@@ -0,0 +1,21 @@
English | [简体中文](README_CN.md)

# PaddleDetection Model Deployment
Collaborator

Why is an additional jetson deployment directory needed? Can't the original cpp directory be reused?

Zheng-Bicheng (Collaborator Author) commented Mar 6, 2023

PaddlePaddle/Paddle#50631 (comment): the task there requires placing it in this directory.

@Zheng-Bicheng
Collaborator Author

Zheng-Bicheng commented Mar 6, 2023

Running OpenVINO directly seems to hit a model-conversion problem? Does OpenVINO support converting a Paddle Inference model directly into the official OpenVINO format?

@jiangjiajun
Collaborator

First add the OpenVINO backend to valid_cpu_backends, then try option.use_openvino_backend() directly.
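
A minimal C++ sketch of that test, for reference; it assumes the C++ counterpart of option.use_openvino_backend() is RuntimeOption::UseOpenVINOBackend(), that the new model class is exposed as fastdeploy::vision::detection::SOLOv2, and it reuses the solov2_r50_fpn_1x_coco export layout from the demo command further down.

#include <iostream>
#include "fastdeploy/vision.h"

int main() {
  fastdeploy::RuntimeOption option;
  option.UseCpu();
  option.UseOpenVINOBackend();  // C++ counterpart of the Python option.use_openvino_backend()

  // Model files follow the exported solov2_r50_fpn_1x_coco directory layout.
  auto model = fastdeploy::vision::detection::SOLOv2(
      "solov2_r50_fpn_1x_coco/model.pdmodel",
      "solov2_r50_fpn_1x_coco/model.pdiparams",
      "solov2_r50_fpn_1x_coco/infer_cfg.yml", option);

  // If the OpenVINO Paddle frontend cannot convert some operator, the failure
  // shows up at model load (as seen later in this thread).
  std::cout << (model.Initialized() ? "OpenVINO backend initialized"
                                    : "OpenVINO backend failed to initialize")
            << std::endl;
  return 0;
}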

@Zheng-Bicheng
Collaborator Author

First add the OpenVINO backend to valid_cpu_backends, then try option.use_openvino_backend() directly.

I don't have an x86 Intel CPU machine at the moment; how about merging this first, and you pull it down to test? I only have a Mac M1, so I can't use OpenVINO.

@jiangjiajun
Collaborator

First add the OpenVINO backend to valid_cpu_backends, then try option.use_openvino_backend() directly.

I don't have an x86 Intel CPU machine at the moment; how about merging this first, and you pull it down to test? I only have a Mac M1, so I can't use OpenVINO.

Then let's test it on AI Studio.

@Zheng-Bicheng
Collaborator Author

Then let's test it on AI Studio.

I'll apply for one and give it a try; it seems AI Studio needs a dedicated development environment to compile FastDeploy.

@jiangjiajun
Collaborator

Just choose the CPU environment (when compiling FD there is no need to enable GPU; CPU only is enough).

@Zheng-Bicheng
Collaborator Author

I've tested it; OpenVINO doesn't work.

(base) zbc@pop-os:~/FastDeploy/examples/vision/detection/paddledetection/jetson/cpp/build$ ./infer_solov2_demo ./solov2_r50_fpn_1x_coco 000000014439.jpg 0
[INFO] fastdeploy/vision/common/processors/transform.cc(159)::FuseNormalizeColorConvert	BGR2RGB and Normalize are fused to Normalize with swap_rb=1
terminate called after throwing an instance of 'ov::Exception'
  what():  Check 'creator_it != CREATORS_MAP.end()' failed at src/frontends/paddle/src/frontend.cpp:46:
FrontEnd API failed with OpConversionFailure: :
No creator found for linspace node.

jiangjiajun merged commit 0687d3b into PaddlePaddle:develop on Mar 8, 2023
Zheng-Bicheng deleted the solov2 branch on March 8, 2023 at 02:04
@janus-zheng

I've tested it; OpenVINO doesn't work.

(base) zbc@pop-os:~/FastDeploy/examples/vision/detection/paddledetection/jetson/cpp/build$ ./infer_solov2_demo ./solov2_r50_fpn_1x_coco 000000014439.jpg 0
[INFO] fastdeploy/vision/common/processors/transform.cc(159)::FuseNormalizeColorConvert	BGR2RGB and Normalize are fused to Normalize with swap_rb=1
terminate called after throwing an instance of 'ov::Exception'
  what():  Check 'creator_it != CREATORS_MAP.end()' failed at src/frontends/paddle/src/frontend.cpp:46:
FrontEnd API failed with OpConversionFailure: :
No creator found for linspace node.

@Zheng-Bicheng Has this issue been resolved? I ran into a similar problem when testing a ViT model.

@Zheng-Bicheng
Collaborator Author

@Zheng-Bicheng Has this issue been resolved? I ran into a similar problem when testing a ViT model.

Judging from the error, it looks like some operators have not been adapted in OpenVINO? I'm not very familiar with OpenVINO; maybe ask @jiangjiajun?
