Add Hard sigmoid activation function #2112

Merged
merged 7 commits into tracel-ai:main from wingertge:hard_sigmoid on Aug 7, 2024

Conversation

wingertge (Contributor)

Pull Request Template

Checklist

  • Confirmed that the run-checks all script has been executed.
  • Made sure the book is up to date with changes in this PR.

Related Issues/PRs

Required to load the DbNet and PaddlePaddle OCR ONNX models used in RapidOCR.

Changes

  1. Add a (configurable) hard sigmoid activation function (sketched below)
  2. Add ONNX import support for the HardSigmoid node
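
For reference, ONNX's HardSigmoid computes max(0, min(1, alpha * x + beta)), with spec defaults alpha = 0.2 and beta = 0.5. Here is a minimal plain-Rust sketch of that definition; the module added in this PR lives in crates/burn-core/src/nn/hard_sigmoid.rs and operates on tensors, so this scalar version is illustrative only:

```rust
/// Scalar reference implementation of ONNX HardSigmoid:
/// max(0, min(1, alpha * x + beta)).
fn hard_sigmoid(x: f32, alpha: f32, beta: f32) -> f32 {
    (alpha * x + beta).clamp(0.0, 1.0)
}

fn main() {
    // With the ONNX defaults (alpha = 0.2, beta = 0.5), x = 0 maps to 0.5,
    // matching sigmoid(0).
    assert_eq!(hard_sigmoid(0.0, 0.2, 0.5), 0.5);
    // Inputs outside the linear region saturate at 0 or 1.
    assert_eq!(hard_sigmoid(10.0, 0.2, 0.5), 1.0);
    assert_eq!(hard_sigmoid(-10.0, 0.2, 0.5), 0.0);
}
```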

Testing

New unit tests and end-to-end ONNX tests.

wingertge changed the title from "Hard sigmoid" to "Add Hard sigmoid activation function" on Aug 5, 2024
antimora (Collaborator) left a comment:

LGTM from my end!

One part I am not sure about: do we need a hard_sigmoid_backward function as well?

@nathanielsimard can you please confirm?

codecov bot commented Aug 5, 2024

Codecov Report

Attention: Patch coverage is 98.01325% with 3 lines in your changes missing coverage. Please review.

Project coverage is 86.27%. Comparing base (f7639bd) to head (a9e480f).
Report is 7 commits behind head on main.

| Files | Patch % | Lines |
|---|---|---|
| crates/burn-import/src/onnx/op_configuration.rs | 80.00% | 2 Missing ⚠️ |
| crates/burn-core/src/nn/hard_sigmoid.rs | 97.36% | 1 Missing ⚠️ |
Additional details and impacted files
```text
@@            Coverage Diff             @@
##             main    #2112      +/-   ##
==========================================
+ Coverage   86.23%   86.27%   +0.04%
==========================================
  Files         688      690       +2
  Lines       88465    88634     +169
==========================================
+ Hits        76285    76473     +188
+ Misses      12180    12161      -19
```


wingertge (Contributor, Author) replied:

> One part I am not sure about: do we need a hard_sigmoid_backward function as well?

The only other activation function with a scalar parameter is leaky_relu, which also doesn't have a backward function, so I couldn't figure out how to implement one. If there is a way, I'd be glad to add it.

laggui (Member) left a comment:

Thanks for adding the implementation all the way through ONNX import!

For the backward, I think it's fine for now. We can let autodiff handle it since it's implemented as a combination of other ops.
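
A hypothetical sketch of that point (method names assume Burn's tensor API; this is not the exact code from the PR): because the forward pass is a chain of ops the autodiff backend already knows how to differentiate, no hand-written hard_sigmoid_backward is needed.

```rust
use burn::tensor::{backend::Backend, Tensor};

/// Hard sigmoid expressed as a composition of differentiable primitives:
/// alpha * x + beta, clamped to [0, 1]. An autodiff backend can derive
/// the backward pass for the whole chain automatically.
fn hard_sigmoid<B: Backend, const D: usize>(
    x: Tensor<B, D>,
    alpha: f64,
    beta: f64,
) -> Tensor<B, D> {
    x.mul_scalar(alpha).add_scalar(beta).clamp(0.0, 1.0)
}
```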

antimora merged commit a01004d into tracel-ai:main on Aug 7, 2024
14 checks passed
wingertge deleted the hard_sigmoid branch on August 8, 2024 at 19:51
3 participants