Hardswish example #426
I think this may be incorrect for the
@samiit Hi, did your implementation end up working? I am facing the same problem. Sorry to bother you ~~
@ntut108318099 - I took the lazy way out and just wrote the Hardswish implementation in PyTorch, then replaced all the activations in the model with my version, and it then converts just fine:

```python
import torch.nn as nn
import torch.nn.functional as F
from torch import Tensor

# Swish, HardSigmoid and HardSwish are adapted from:
# https://github.com/Randl/MobileNetV3-pytorch/blob/master/MobileNetV3.py

def swish(x: Tensor) -> Tensor:
    return x * x.sigmoid()


def hard_sigmoid(x: Tensor, inplace: bool = False) -> Tensor:
    return F.relu6(x + 3, inplace) / 6


def hard_swish(x: Tensor, inplace: bool = False) -> Tensor:
    return x * hard_sigmoid(x, inplace)


class Hardswish2(nn.Module):
    def __init__(self, inplace: bool = False):
        super(Hardswish2, self).__init__()
        self.inplace = inplace

    def forward(self, x: Tensor) -> Tensor:
        return hard_swish(x, inplace=self.inplace)
```
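A minimal sketch of the "replace all the activations" step, assuming the model uses `nn.Hardswish` modules — the `replace_hardswish` helper and the torchvision model below are illustrative, not from the original post:

```python
import torch
import torch.nn as nn
from torchvision import models


def replace_hardswish(module: nn.Module) -> None:
    # Recursively swap every nn.Hardswish for the conversion-friendly Hardswish2.
    for name, child in module.named_children():
        if isinstance(child, nn.Hardswish):
            # Older nn.Hardswish versions have no `inplace` attribute, so fall back to False.
            setattr(module, name, Hardswish2(inplace=getattr(child, "inplace", False)))
        else:
            replace_hardswish(child)


# Hypothetical example: MobileNetV3 uses Hardswish throughout.
model = models.mobilenet_v3_small()
replace_hardswish(model)
model.eval()

# The swapped model should produce the same outputs as before.
x = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    y = model(x)
```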
@HamsterHuey Thanks for your reply. That's an amazing idea! Thanks for your help.
@ntut108318099 - I just copied that code from the MobileNet repo into my own. Basically, you define the Hardswish activation in PyTorch primitives. Because the HardSwish here is defined in terms of ReLU6 and elementwise multiplication, which the converter already supports, it exports without needing a native Hardswish op.
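As a sanity check, the PyTorch-primitive version can be compared against the built-in `F.hardswish` (available in PyTorch ≥ 1.6); a minimal sketch:

```python
import torch
import torch.nn.functional as F

x = torch.linspace(-5, 5, steps=101)
mine = x * F.relu6(x + 3) / 6   # the hard_swish from the snippet above
builtin = F.hardswish(x)        # PyTorch's native Hardswish
assert torch.allclose(mine, builtin)  # identical: x * relu6(x + 3) / 6
```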
@HamsterHuey Hi, I followed your tutorial, and it worked!! Have a nice day.
@ntut108318099 Hi! Could you please share your solution? My implementation returns NaN.
Can anyone kindly comment on whether my implementation of hardswish is correct? I can well imagine that it is not the most efficient, but it is the best I could get to.
Thanks for the great repo!
Sam
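For reference, Hardswish as defined in the MobileNetV3 paper, which an implementation can be checked against pointwise:

$$
\mathrm{Hardswish}(x) = x \cdot \frac{\mathrm{ReLU6}(x + 3)}{6} =
\begin{cases}
0, & x \le -3 \\
x, & x \ge 3 \\
\dfrac{x\,(x + 3)}{6}, & \text{otherwise}
\end{cases}
$$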