I do realtime learning with libtorch, where the gradient is calculated elsewhere, and I would like to translate that to ExecuTorch: https://github.com/berndporr/dnf_torch. That project does realtime noise cancellation, but I also have other applications (RL, for instance) where the gradient comes from further afield, so it is not tied to a simple subtraction operation. In libtorch I can directly inject the gradient and then run the backward pass: https://github.com/berndporr/dnf_torch/blob/main/dnf_torch.cpp#L99
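For context, the libtorch pattern I'm translating boils down to calling backward() with an externally supplied gradient. Here it is as a Python sketch for brevity; the network, shapes, and gradient source are placeholders:

```python
import torch
import torch.nn as nn

net = nn.Linear(4, 2)                    # stand-in for the real network
optimizer = torch.optim.SGD(net.parameters(), lr=0.01)

x = torch.randn(1, 4)
pred = net(x)                            # forward pass only
external_grad = torch.randn_like(pred)   # gradient computed elsewhere (e.g. by an RL rule)
pred.backward(gradient=external_grad)    # inject the gradient, then backpropagate
optimizer.step()
```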
In ExecuTorch, however, forward and backward run in one go as a single graph. How do I go about this?
Looking at the XOR example, could I inject the gradient as an additional input? Roughly:
```python
import torch
import torch.nn as nn

class TrainingNet(nn.Module):
    def __init__(self, net):
        super().__init__()
        self.net = net

    def forward(self, gradient, input):
        pred = self.net(input)
        # Surrogate loss: d(loss)/d(pred) equals the injected `gradient`,
        # so differentiating this scalar pushes it back through self.net.
        loss = (pred * gradient).sum()
        return loss, pred.detach().argmax(dim=1)
```
and then lower that and inject the gradient as the first input. Would that be the way to go?
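For completeness, here is roughly how I'd expect the lowering to look, assuming the same export helpers the XOR training example uses (a sketch: the stand-in inner network, shapes, and file name are placeholders):

```python
import torch
import torch.nn as nn
from torch.export import export
from torch.export.experimental import _export_forward_backward
from executorch.exir import to_edge

training_net = TrainingNet(nn.Linear(4, 2))  # stand-in inner network

example_grad = torch.randn(1, 2)    # placeholder: matches the output shape
example_input = torch.randn(1, 4)   # placeholder: matches the input shape

ep = export(training_net, (example_grad, example_input))
ep = _export_forward_backward(ep)   # joint forward+backward graph
et_program = to_edge(ep).to_executorch()

with open("training_net.pte", "wb") as f:
    f.write(et_program.buffer)
```

At runtime the externally computed gradient would then be passed in as the first input on every step.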