Resolving inconsistency between attention/attention_bias #17587
base: main
Conversation
@Cookiee235 please take a look when you have a chance, thank you!
@tvm-bot rerun
@tqchen Do you know why the CI testing is pending?
@jikechao we are working on fixing the ci
@tvm-bot rerun
@parsifal-47 Please fix the ci checks :)
I am going to check that over the weekend, I see that
Here is the error. You could run the command below :) `pytest tests/python/relax/test_transform_allocate_workspace.py`
@tvm-bot rerun
Thank you for the reference; the test should be fixed now.
Force-pushed db175c7 to 211384e
Force-pushed 211384e to 60f934f
Now it fails on
This is a fix for:
#17486
It resolves the inconsistency between the attention/attention_bias operations; after the fix, the script from #17486 prints the proper IR.
Thank you!