The GELU activation function is missing in KotlinDL.
The desired PR addressing this issue should include:

- an implementation of an activation class named `GeluActivation` (you can take inspiration from the implementation of `HardSigmoid` as a reference) added to the `Activations.kt` file; a sketch follows this list
- an approximate version of GELU
- documentation for the activation function
- JUnit tests in the `api` module (a value-check sketch is given below)
- support for exporting the activation function to JSON (see `ModelSaver.kt`)
- support for importing the activation function from JSON (see `ModelLoader.kt`; a serialization sketch is given below)
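For the first two items, recall that GELU(x) = x·Φ(x) = 0.5·x·(1 + erf(x/√2)), and the tanh approximation from the GELU paper is GELU(x) ≈ 0.5·x·(1 + tanh(√(2/π)·(x + 0.044715·x³))). Below is a minimal sketch of what `GeluActivation` could look like. The `Activation` interface shown here is an assumption modeled on how `HardSigmoid` plugs into `Activations.kt`, and the op names (`tf.math.erf` etc.) should be checked against the TensorFlow Java version KotlinDL uses; mirror the real interface before implementing.

```kotlin
import org.tensorflow.Operand
import org.tensorflow.op.Ops

// Assumed shape of the interface in Activations.kt; mirror the real one
// (the same one HardSigmoid implements) in the actual PR.
public interface Activation {
    public fun apply(tf: Ops, features: Operand<Float>): Operand<Float>
}

public class GeluActivation(public val approximate: Boolean = false) : Activation {

    override fun apply(tf: Ops, features: Operand<Float>): Operand<Float> {
        return if (approximate) {
            // 0.5 * x * (1 + tanh(sqrt(2 / pi) * (x + 0.044715 * x^3)))
            val inner = tf.math.add(
                features,
                tf.math.mul(tf.constant(0.044715f), tf.math.pow(features, tf.constant(3f)))
            )
            tf.math.mul(
                tf.constant(0.5f),
                tf.math.mul(
                    features,
                    tf.math.add(
                        tf.constant(1f),
                        tf.math.tanh(tf.math.mul(tf.constant(0.7978846f), inner)) // sqrt(2 / pi)
                    )
                )
            )
        } else {
            // Exact form: x * 0.5 * (1 + erf(x / sqrt(2)))
            tf.math.mul(
                features,
                tf.math.mul(
                    tf.constant(0.5f),
                    tf.math.add(
                        tf.constant(1f),
                        tf.math.erf(tf.math.div(features, tf.constant(1.4142135f)))
                    )
                )
            )
        }
    }
}
```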
P.S. If you want to take this ticket, please leave a comment below.
P.P.S. Read the Contributing Guidelines.
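For the JUnit item, one low-tech way to pin down expected values is to compare against a scalar reference computed with `kotlin.math`. This sketch only exercises the tanh approximation (the Kotlin standard library has no `erf`); the class and test names are hypothetical, and the real tests should run `GeluActivation` through the same harness the existing activation tests in the `api` module use.

```kotlin
import kotlin.math.PI
import kotlin.math.sqrt
import kotlin.math.tanh
import org.junit.jupiter.api.Assertions.assertEquals
import org.junit.jupiter.api.Test

class GeluActivationTest {

    // Pure-Kotlin scalar reference for the tanh approximation of GELU.
    private fun geluApprox(x: Double): Double =
        0.5 * x * (1.0 + tanh(sqrt(2.0 / PI) * (x + 0.044715 * x * x * x)))

    @Test
    fun approximateGeluMatchesKnownValues() {
        assertEquals(0.0, geluApprox(0.0), 1e-7)         // GELU(0) = 0
        assertEquals(0.8411920, geluApprox(1.0), 1e-6)   // published reference value
        assertEquals(-0.1588080, geluApprox(-1.0), 1e-6) // GELU is not an odd function
        assertEquals(10.0, geluApprox(10.0), 1e-4)       // behaves like identity for large x
    }
}
```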
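For the export/import items, the likely change is one extra branch in each of the two string↔enum mappings. The function and enum names below are assumptions sketched from the `ModelSaver.kt` / `ModelLoader.kt` pattern the issue points at, not KotlinDL's confirmed API; align them with the real code.

```kotlin
// ModelSaver.kt — hypothetical excerpt: map the activation to its JSON/Keras name.
private fun convertToKerasActivation(activation: Activations): String? {
    return when (activation) {
        Activations.Relu -> "relu"
        Activations.Gelu -> "gelu" // new branch for this issue
        // ... branches for the remaining activations ...
        else -> throw IllegalStateException("Unsupported activation: $activation")
    }
}

// ModelLoader.kt — hypothetical excerpt: the inverse mapping when reading JSON.
private fun convertToActivation(activation: String): Activations {
    return when (activation) {
        "relu" -> Activations.Relu
        "gelu" -> Activations.Gelu // new branch for this issue
        // ... branches for the remaining activations ...
        else -> throw IllegalStateException("Unsupported activation: $activation")
    }
}
```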
Read more in Gaussian Error Linear Units (GELUs) and BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding.
A reference implementation can be taken from tensorflow-addons.