Bump various ops to opset 22, adding bf16 support #3059
base: main
Conversation
Bumped ops: Acos, Acosh, Asin, Asinh, Atan, Atanh, AveragePool, Bernoulli, Conv, ConvTranspose, Cos, Cosh, DeformConv, Det, Dropout, Elu, EyeLike, GRU, GlobalAveragePool, GlobalMaxPool, HardSigmoid, HardSwish, InstanceNormalization, LSTM, LpNormalization, LpPool, MaxPool, MaxRoiPool, MaxUnpool, Mish, Multinomial, NegativeLogLikelihoodLoss, RNN, RandomNormal, RandomNormalLike, RandomUniform, RandomUniformLike, RoiAlign, Round, Selu, Sin, Sinh, Softplus, Softsign, Tan, ThresholdedRelu

Signed-off-by: Rickert, Jonas <[email protected]>
@jorickert maybe there is some confusion about how to update these files, but what was done is incorrect. We would not simply replace the opset version with the latest one, even though support is given for it. How the files should be updated is shown below in this example:
Here, for Clip, we specify the minimum opset version we support along with any of the newer ones. We might also need to check that the operations support versions 20 and 21, in case there was a data type change there.
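The version-selection scheme being described can be sketched roughly like this (a hypothetical Python illustration, not the actual onnx-mlir registration code; the dictionary and function names are invented, and the Clip version list reflects that opsets 11/12/13 changed its signature):

```python
# Hypothetical sketch of op version registration: each op maps to the opset
# versions the importer has handlers for, and the importer picks the newest
# registered version that does not exceed the model's opset import.
OP_SUPPORTED_VERSIONS = {
    # Clip needs several entries because its signature changed across opsets
    # (min/max moved from attributes to inputs).
    "Clip": [13, 12, 11, 6],
    # Acos has kept the same signature since opset 7; opset 22 only widens
    # the accepted types, so a single entry can suffice if older models are
    # treated as compatible.
    "Acos": [22],
}

def select_op_version(op_name: str, model_opset: int) -> int:
    """Return the newest registered version <= model_opset; if the model
    predates all registered versions, fall back to the oldest registered
    one (assuming the op is backwards compatible)."""
    versions = sorted(OP_SUPPORTED_VERSIONS[op_name], reverse=True)
    for v in versions:
        if v <= model_opset:
            return v
    return versions[-1]
```

Under this sketch, a Clip from an opset-12 model resolves to the opset-12 handler, while an Acos from an opset-7 model falls through to the single opset-22 handler.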
This is different from how I thought the versioning works. Do really all supported versions need to be listed, or only those where there is a breaking change between them? None of the changes in this PR are breaking; they only extend the type support.
I tested it, and with this PR I can successfully parse both Acos opset 7 and Acos opset 22. I will add a test for this.
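The non-breaking claim can be illustrated with a small check (a hypothetical sketch; the function name is invented, but the type lists follow the ONNX spec for Acos, where opset 22 adds bfloat16 to the existing float16/float/double):

```python
# Acos accepted input types per opset, per the ONNX operator spec.
ACOS_TYPES = {
    7: {"float16", "float", "double"},
    22: {"float16", "float", "double", "bfloat16"},
}

def is_type_extension_only(old: int, new: int, types=ACOS_TYPES) -> bool:
    """True if the newer opset version only widens the accepted types,
    i.e. every type valid under the old version is still valid."""
    return types[old] <= types[new]
```

Since the opset-7 type set is a strict subset of the opset-22 one and the signature is unchanged, any valid opset-7 Acos node remains valid when imported with the opset-22 handler.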
Signed-off-by: Rickert, Jonas <[email protected]>
The way @jorickert implemented this is correct. Because opset 22 only adds a new type, the ops are backwards compatible. This LGTM.
@jorickert I would defer to @AlexandreEichenberger or @chentong319. Regardless of whether the ops are backwards compatible, I want to make sure the precedent is followed by the maintainers of the community. FYI @philass
I am blocking until the maintainers review :)