
Support for YOLOv5 P6 Models #281

Closed
JohannaSopraSteria opened this issue Oct 11, 2022 · 14 comments · Fixed by #284
Assignees
Labels
bug (Something isn't working), tensorflow.js (Issues related to local inference)

Comments

@JohannaSopraSteria

Hi! First and foremost, thanks for adding support for YOLOv5 models 😃

For my project I am using a YOLOv5 P6 model. When I try to upload this model in the specified format I get this error message:
[screenshot: "Inference failed" error message]

It seems that so far Make Sense only accepts the original YOLOv5 P3-P5 models, as uploading such a model works fine. Would it be possible to extend this so that YOLOv5 P6 models can also be used to annotate images?

@github-actions

👋 Hello @JohannaSopraSteria, thank you for your interest in make-sense, a free-to-use online tool for labelling photos! 🏷️

🐞 Bug reports

If you noticed that make-sense is not working properly, please provide us with as much information as possible. To make your life easier, we have prepared a bug report template containing all the relevant details. We know, we ask for a lot... However, please believe that knowing all that extra information - like the type of browser you use or the version of node you have installed - really helps us to solve your problems faster and more efficiently. 😉

💬 Get in touch

If you've been trying to contact us but for some reason we haven't responded to your issue yet, don't hesitate to get back to us on Gitter or Twitter.

💻 Local setup

# clone repository
git clone https://github.com/SkalskiP/make-sense.git

# navigate to main dir
cd make-sense

# install dependencies
npm install

# serve with hot reload at localhost:3000
npm start

To ensure the application functions properly when run locally, npm 6.x.x and Node.js v12.x.x are required. More information about this problem is available in issue #16.

@SkalskiP
Owner

SkalskiP commented Oct 11, 2022

Hi, @JohannaSopraSteria 👋! Thanks for letting me know. From what I can see the model fails during inference, not model loading.

First of all, I need to replicate the error. Which weights are you using precisely? yolov5s6?

@SkalskiP SkalskiP self-assigned this Oct 11, 2022
@SkalskiP SkalskiP added the bug and tensorflow.js labels Oct 11, 2022
@JohannaSopraSteria
Author

Correct, I'm using yolov5s6.

@SkalskiP
Owner

I just tested yolov5s6. I converted it to the tensorflow.js format with `python export.py --weights yolov5s6.pt --include tfjs` and it worked.

So I think there might be another reason. Do you use a custom model (your own, trained on some custom dataset)?

@SkalskiP
Owner

@JohannaSopraSteria could you share your model with me, over email or here, along with the txt file containing the class names?

@JohannaSopraSteria
Author

@SkalskiP I've shared the file with you via mail.

@SkalskiP
Owner

Hello @JohannaSopraSteria! I'm afraid I haven't got anything. :(

@SkalskiP
Owner

I have it. It was in spam. Let me take a look :)

@SkalskiP
Owner

@JohannaSopraSteria I know what the problem seems to be: input resolution. I hardcoded the input resolution as 640, since that worked for all the models I tested. Your model requires 1280. Now I need to think about how to fix it.

  • Hardcode it as 1280 - the problem is that the inference will be slower for other YOLOv5 models.
  • Allow the user to select the resolution. It will take me some time to do it right. Not to mention that you could still select the incorrect one.
  • Try to detect if the model requires a larger input resolution somehow.
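The third option above could be sketched along these lines. This is a minimal, hypothetical TypeScript helper (not the actual make-sense code): it assumes the loaded tfjs GraphModel exposes its input tensor shape (typically `[batch, height, width, channels]` for YOLOv5 exports) and reads the resolution from it, falling back to the previously hardcoded 640 when the shape is dynamic or unavailable.

```typescript
// Hypothetical helper: infer the required input resolution from a model's
// input tensor shape, e.g. the array found at `model.inputs[0].shape` after
// loading a YOLOv5 tfjs export. Dynamic dimensions may appear as -1 or null.
function inferInputResolution(shape: Array<number | null>): number {
    const fallback = 640; // resolution previously hardcoded in Make Sense

    // For a [batch, height, width, channels] layout, height sits at index 1.
    const height = shape.length >= 3 ? shape[1] : null;

    // Use the declared height only when it is a concrete positive number.
    return height !== null && height !== undefined && height > 0
        ? height
        : fallback;
}

// yolov5s (P3-P5) exports declare 640; yolov5s6 (P6) declares 1280.
const small = inferInputResolution([-1, 640, 640, 3]);   // 640
const large = inferInputResolution([-1, 1280, 1280, 3]); // 1280
console.log(small, large);
```

With something like this, the editor could size its preprocessing to whatever the uploaded model declares, instead of assuming one fixed resolution.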

SkalskiP added a commit that referenced this issue Oct 12, 2022
@SkalskiP SkalskiP linked a pull request Oct 12, 2022 that will close this issue
@SkalskiP
Owner

Hi @JohannaSopraSteria! I believe I managed to fix the model loading problems. Please take a look at this test version of Make Sense and let me know if it works :)

@JohannaSopraSteria
Author

Hi @SkalskiP :) Unfortunately I still get the "Inference failed" message :/ I'm still using the same model I sent you.

@SkalskiP
Owner

Oh sorry, @JohannaSopraSteria, it looks like I mixed up the links...

Please take a look here: https://fix-281-support-for-yolov5-p6-models.d2e9l8xwkaodsq.amplifyapp.com/

@JohannaSopraSteria
Author

Works perfectly 😁 Well done @SkalskiP !

@SkalskiP
Owner

This is great @JohannaSopraSteria! 🎉 I'm really happy we got it to work! Thank you very much for bringing that problem to my attention.

We are not yet ready to release the new version of Make Sense, so for now you can access the fixed editor on our test instance: https://develop.makesense.ai.
