[Model] Add Solov2 For PaddleDetection #1435
Conversation
Only testing CI for now; this is not finished yet.
@DefTruth Could you take a look?
examples/vision/detection/paddledetection/jetson/cpp/CMakeLists.txt
examples/vision/detection/paddledetection/jetson/cpp/README_CN.md
@DefTruth The visualization results are as follows:
@jiangjiajun Updated as requested; you only need to review the latest commit.
const ModelFormat& model_format = ModelFormat::PADDLE)
    : PPDetBase(model_file, params_file, config_file, custom_option,
                model_format) {
  valid_cpu_backends = {Backend::PDINFER};
Only the Paddle Inference backend is tested here; consider adding tests for other backends such as ORT/OV as well.
@jiangjiajun @DefTruth As far as I can tell, ORT/OV both use ONNX as the model format for inference? But converting SOLOv2 to ONNX currently fails because some operators are unsupported; this may need to be fixed on the Paddle2ONNX side first?
Run OpenVINO directly (no ONNX conversion needed). If it works, it needs to be added to the CPU backends.
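The reviewer's suggestion amounts to extending the `valid_cpu_backends` list in the constructor quoted above. A minimal Python sketch of that change, using stand-in enum values that mirror the C++ `Backend` names from the diff (this mock is illustrative only, not FastDeploy's actual API):

```python
from enum import Enum

# Stand-in for FastDeploy's C++ Backend enum; the member names are taken
# from the diff above, the values here are purely illustrative.
class Backend(Enum):
    PDINFER = "paddle_inference"  # Paddle Inference
    OPENVINO = "openvino"         # OpenVINO
    ORT = "onnxruntime"           # ONNX Runtime

# Current state of the SOLOv2 constructor: only Paddle Inference is listed.
valid_cpu_backends = [Backend.PDINFER]

# Suggested change: once OpenVINO is verified to run SOLOv2 directly
# (no ONNX conversion), add it to the CPU backend list.
valid_cpu_backends = [Backend.PDINFER, Backend.OPENVINO]

print([b.name for b in valid_cpu_backends])
```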
Updated as requested.
:return: a new SOLOv2 object
"""
class SOLOv2Clone(YOLOv3):
This is still YOLOv3; it needs to be changed to SOLOv2.
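The fix being asked for is a change of base class. Sketched below with stand-in classes (the real `YOLOv3`/`SOLOv2` live in FastDeploy's Python bindings and are not reproduced here):

```python
# Stand-ins for the FastDeploy binding classes (illustrative only).
class YOLOv3:
    pass

class SOLOv2:
    pass

# Before the fix, the clone class wrongly inherited from YOLOv3:
#   class SOLOv2Clone(YOLOv3): ...
# After the fix it inherits from SOLOv2:
class SOLOv2Clone(SOLOv2):
    pass

assert issubclass(SOLOv2Clone, SOLOv2)
assert not issubclass(SOLOv2Clone, YOLOv3)
```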
@jiangjiajun Please take a look; if there are any remaining issues, I'll fix them.
@@ -0,0 +1,21 @@
English | [简体中文](README_CN.md)

# PaddleDetection Model Deployment
Why add a separate jetson deployment directory? Can't the original cpp directory be reused?
PaddlePaddle/Paddle#50631 (comment) — the task there requires placing it in this directory.
const ModelFormat& model_format = ModelFormat::PADDLE)
    : PPDetBase(model_file, params_file, config_file, custom_option,
                model_format) {
  valid_cpu_backends = {Backend::PDINFER};
Run OpenVINO directly (no ONNX conversion needed). If it works, it needs to be added to the CPU backends.
Running OpenVINO directly seems to hit a model-conversion problem? Does OpenVINO support converting Paddle Inference models directly into the official OpenVINO format?
First add the OpenVINO backend to valid_cpu_backends, then just try option.use_openvino_backend().
I don't have an x86 Intel CPU machine at the moment; could you pull this branch and test it instead once it's ready? I'm on a Mac M1, so I can't use OpenVINO.
Then test it on AI Studio.
I'll apply for an environment and try. It seems AI Studio needs a dedicated development environment to compile FastDeploy.
Just choose the CPU environment (when compiling FD there's no need to enable GPU; CPU only is fine).
Tested it; OpenVINO doesn't work.
@Zheng-Bicheng Has this problem been resolved? I ran into a similar issue when testing a ViT model.
Judging from the error, OpenVINO seems to be missing support for some operators? I'm not very familiar with OpenVINO; maybe ask @jiangjiajun?
PR types
Model
Description
Add SOLOv2