
Add Gelu activation function #165

Closed
6 tasks
zaleslaw opened this issue Jul 29, 2021 · 1 comment · Fixed by #187
zaleslaw commented Jul 29, 2021

The Gelu activation function is missing in KotlinDL.

The desired PR addressing this issue should include:

  • An activation class named GeluActivation (the HardSigmoid implementation can serve as a reference), added to the Activations.kt file
  • An approximate version of Gelu
  • Documentation for the activation function
  • JUnit tests in the api module
  • Support for exporting the activation function to JSON (see ModelSaver.kt)
  • Support for importing the activation function from JSON (see ModelLoader.kt)

P.S. If you want to take this ticket, please leave a comment below.
P.P.S. Read the Contributing Guidelines.

Read more about Gaussian Error Linear Units (GELUs) and BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding.

The reference implementation could be taken from tensorflow-addons
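Not part of the issue text, but as a sketch of the math behind the two requested variants: the exact GELU is 0.5 · x · (1 + erf(x / √2)), and the approximate version replaces erf with a tanh-based formula. The minimal standalone Kotlin below is illustrative only (the function names and the standalone erf via the Abramowitz–Stegun approximation are this sketch's own choices, not KotlinDL's actual API, which builds on TensorFlow ops):

```kotlin
import kotlin.math.PI
import kotlin.math.abs
import kotlin.math.exp
import kotlin.math.pow
import kotlin.math.sqrt
import kotlin.math.tanh

// Kotlin's stdlib has no erf, so this sketch uses the
// Abramowitz–Stegun 7.1.26 polynomial approximation (max error ~1.5e-7).
fun erf(x: Double): Double {
    val t = 1.0 / (1.0 + 0.3275911 * abs(x))
    val poly = ((((1.061405429 * t - 1.453152027) * t + 1.421413741) * t
            - 0.284496736) * t + 0.254829592) * t
    val y = 1.0 - poly * exp(-x * x)
    return if (x >= 0.0) y else -y
}

// Exact GELU: 0.5 * x * (1 + erf(x / sqrt(2)))
fun geluExact(x: Double): Double = 0.5 * x * (1.0 + erf(x / sqrt(2.0)))

// Tanh-based approximation from the GELU paper:
// 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
fun geluApprox(x: Double): Double =
    0.5 * x * (1.0 + tanh(sqrt(2.0 / PI) * (x + 0.044715 * x.pow(3))))

fun main() {
    // The two variants agree closely over typical activation ranges.
    for (x in listOf(-2.0, -1.0, 0.0, 1.0, 2.0)) {
        println("gelu(%+.1f) exact=%.6f approx=%.6f".format(x, geluExact(x), geluApprox(x)))
    }
}
```

In a real PR, the `approximate` flag would select between the two branches inside the single activation class, mirroring how tensorflow-addons exposes it.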

@zaleslaw zaleslaw added the good first issue Good for newcomers label Jul 29, 2021
@zaleslaw zaleslaw added this to the 0.3 milestone Jul 29, 2021
@therealansh
Contributor

@zaleslaw I'll start working on this next up.
