Replies: 1 comment
-
This feature is not currently available: see #140
-
How can I use DJL to write a custom function that implements its own forward and backward passes? That is, the gradient is not obtained through auto-differentiation but computed manually. In PyTorch we can do this as follows; is something similar possible in DJL?
class MyReLU(torch.autograd.Function):
    """We can implement our own custom autograd Functions by subclassing
    torch.autograd.Function and implementing the forward and backward
    passes, which operate on Tensors."""
    @staticmethod
    def forward(ctx, input):
        ctx.save_for_backward(input)  # keep input for the backward pass
        return input.clamp(min=0)

    @staticmethod
    def backward(ctx, grad_output):
        input, = ctx.saved_tensors
        # manual gradient: pass grad through only where input > 0
        return grad_output * (input > 0).to(grad_output.dtype)
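For reference, here is the PyTorch pattern used end to end: a self-contained sketch (the class definition is repeated so the snippet runs on its own) that applies the custom Function and checks that the manually written backward produces the expected ReLU gradient.

```python
import torch

class MyReLU(torch.autograd.Function):
    """ReLU with a manually implemented backward pass."""
    @staticmethod
    def forward(ctx, input):
        ctx.save_for_backward(input)  # keep input for the backward pass
        return input.clamp(min=0)

    @staticmethod
    def backward(ctx, grad_output):
        input, = ctx.saved_tensors
        # manual gradient: pass grad through only where input > 0
        return grad_output * (input > 0).to(grad_output.dtype)

x = torch.tensor([-1.0, 0.5, 2.0], dtype=torch.double, requires_grad=True)
y = MyReLU.apply(x).sum()
y.backward()
# x.grad is tensor([0., 1., 1.], dtype=torch.float64)
```

Note that `Function.apply` (not the class constructor) is how a custom Function is invoked; PyTorch routes the saved context through `ctx` between the two static methods.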