Add unsafety test for consumers of salsa macros #692
base: master
Conversation
Thanks. I like the idea of testing that the salsa macros don't emit any errors. I'd prefer changing our existing test to catch all clippy errors (including pedantic and restriction) over introducing a new test just for this; see here. It would also be great if we could fix the macros. Would you be interested in adding a fix in your PR? Edit: #691 might have removed the …
@MichaReiser Thanks for taking a look!
That sounds super reasonable to me; I'll take a look at whether I can at least come up with a failing test in that style. Here's what I would have imagined based on your phrasing, in case you see any adjustments to make:
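(This is only a rough sketch: the struct/function names and the exact lint set are my guesses rather than the project's existing test, and the salsa API details are approximate.)

```rust
// Hypothetical standalone consumer of the salsa macros, with the broad clippy
// groups turned into hard errors so that any warning produced by macro-generated
// code fails the build. `clippy::restriction` could be added per the suggestion
// above, though many of its lints contradict one another.
#![deny(clippy::all, clippy::pedantic, unsafe_code)]

#[salsa::input]
struct Text {
    contents: String,
}

#[salsa::tracked]
fn char_count(db: &dyn salsa::Database, text: Text) -> usize {
    text.contents(db).chars().count()
}

fn main() {}
```

The point being that if the macro expansion trips any of those lints, or emits `unsafe`, this file simply fails to compile.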
My personal clippy configs are generally super pessimistic, so I recognize that I may not be calibrated the same way as the maintainers about what actually constitutes an actionable problem.
Not sure if this is what you're asking, but I tried to mention explicitly in the original PR description that I don't feel capable of providing such a fix myself; I've done no significant work with … There was also always the possibility that you all feel judicious use of unsafe is warranted and that it was my expectations that needed adjusting, so I thought it better to ask first.
I don't think the goal is to forbid the use of any unsafe code in salsa or its macros. Instead, we should mark the code as "generated" and allow the specific lint rules it violates (where it isn't possible to fix, e.g. where unsafe code is genuinely needed). Having said that, a failing test for all clippy and standard lints would be a great starting point!
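For illustration, this is roughly the shape I read that suggestion as on the macro-output side. The type, method, and lint list below are made up; this is not salsa's actual expansion:

```rust
// Sketch only: annotate generated items so a strict consumer lint
// configuration doesn't fire on code the consumer can't edit.
struct Generated;

#[automatically_derived]
#[allow(clippy::all, clippy::pedantic)]
impl Generated {
    fn helper(&self) -> usize {
        // Where a lint genuinely can't be satisfied, the allow stays scoped
        // to the generated item instead of asking consumers to relax their
        // crate-level settings. (A consumer-side `forbid(unsafe_code)` still
        // can't be overridden by an `allow`, though.)
        0
    }
}

fn main() {
    let _ = Generated.helper();
}
```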
After I updated a personal project to a Salsa commit that included #680, I started receiving compiler errors due to my project-local use of `#![forbid(unsafe_code)]` together with the `salsa::tracked` derive macro. I wanted to see whether this was expected due to chosen trade-offs, or whether it's a property of earlier versions of Salsa that might be worth preserving.

The test included in this PR fails as of current master and on the first commit of #680, but passes on that commit's parent. I mostly created this PR to be a place for discussion without polluting the comments of a merged PR, and I don't personally have (or need) a vote on what the project chooses to do with this information. I just thought it might save a tiny bit of legwork and contextualize the change that surprised me: I did not expect my own local crate to newly contain `unsafe` via the derive macros.

If it's desired to permanently incorporate this test to help catch future regressions, I'm happy to make any changes to line up with desired content, commit message style, etc., but I'm not personally capable of designing or implementing a no-unsafe version of the intent in #680.
If preferred, I'm happy to close this and migrate the concern to an issue instead.
`cargo test --no-run --test downstream_unsafe`
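For context, the shape of the kind of test I mean is roughly the following. This is only a sketch, the actual file in the PR may differ, and the salsa API details are approximate:

```rust
// tests/downstream_unsafe.rs -- sketch of the idea only. Each integration
// test is its own crate, so the crate-level `forbid` below applies to
// everything the salsa macros expand to inside this file.
#![forbid(unsafe_code)]

#[salsa::input]
struct SourceFile {
    text: String,
}

#[salsa::tracked]
fn line_count(db: &dyn salsa::Database, file: SourceFile) -> usize {
    file.text(db).lines().count()
}

// `--no-run` means the test only needs to compile; if the macro expansion
// contains `unsafe`, the `forbid` above rejects it at build time.
#[test]
fn builds_without_unsafe() {}
```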