
feat: internal to MongoDB $jsonSchema conversion COMPASS-8701 #218

Merged

Conversation

Contributor

@paula-stacho paula-stacho commented Jan 22, 2025

While the internal format is very verbose, for $jsonSchema we keep the basics:
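As a hedged illustration (an assumed example, not quoted from the PR itself), the conversion keeps roughly the type, required-field and nested-property information and drops the verbose analysis metadata such as counts, probabilities and sample values:

// Illustrative sketch only, not taken from the PR description.
const validator: MongoDBJSONSchema = {
  bsonType: 'object',
  required: ['_id', 'name'],
  properties: {
    _id: { bsonType: 'objectId' },
    name: { bsonType: 'string' },
    age: { bsonType: ['int', 'null'] },
  },
};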

@paula-stacho paula-stacho marked this pull request as ready for review January 22, 2025 12:09
src/schema-convertors/internalToMongoDB.ts (outdated)
const required = [];
const properties: MongoDBJSONSchema['properties'] = {};
for (const field of fields) {
if (signal?.aborted) throw new Error('Operation aborted');
Contributor

This is synchronous code, so there's no way for an abort to happen between multiple iterations of this loop, right?

Contributor Author

Yes, the code is synchronous. I'm not entirely sure about this abort; the concern was that there might be some extremely nested documents, but that kind of thing would most likely already become a problem at the analysis step, so having the abort there is probably enough. What do you think @Anemy?

Member

What Anna is getting at is that once this code starts, we're not releasing the JavaScript thread to do any other work, so the signal can't abort unless it was already aborted before we started. To make the functions cancellable we'll want to yield occasionally, e.g. by making them async and awaiting a promise somewhere inside.

Contributor

> To have the functions be cancellable we'll want to yield occasionally, so wrapping these in async and awaiting a promise somewhere inside.

Fwiw that still wouldn't end up being interruptible by anything other than microtasks – e.g. a user clicking an "abort" button would still never have any effect.
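A small sketch of the distinction, with hypothetical names not taken from this PR: awaiting an already-resolved promise only drains microtasks, so a click handler calling abort() never gets to run, whereas awaiting a setTimeout yields a full event-loop turn.

// Hypothetical sketch (names and structure assumed, not from this PR):
const controller = new AbortController();
document.querySelector('#abort')?.addEventListener('click', () => controller.abort());

async function convert(fields: unknown[], signal: AbortSignal): Promise<void> {
  for (const field of fields) {
    // `await Promise.resolve()` would only drain microtasks, so the click handler
    // above would never run and signal.aborted would stay false for the whole loop.
    // Awaiting a setTimeout yields a full event-loop turn, so the click (and the
    // abort) can actually be observed before the next iteration:
    await new Promise((resolve) => setTimeout(resolve));
    if (signal.aborted) throw new Error('Operation aborted');
    // ...convert `field`...
  }
}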

Member

Would a solution where we occasionally run a setTimeout work nicely? Or do you feel we should consider moving this to a worker? I feel a worker would add some complexity to code that typically runs pretty quickly most of the time, but it does sound like the more elegant solution.

// yield to the event loop every ~100 iterations so pending work (e.g. an abort) can run
if (yieldCounter > 100) {
  yieldCounter = 0;
  await new Promise((resolve) => setTimeout(resolve));
}
yieldCounter++;

Contributor

Yeah, a worker would be nice, but we'd have to be careful about not breaking webpack in downstream packages or running in browser environments, which is probably where a lot of the complexity would lie.

setTimeout without a timeout sounds like a decent approach that's a lot simpler, yeah.

Contributor Author

@paula-stacho paula-stacho Jan 28, 2025

Thank you both; it turns out I didn't actually understand how this works. I've tried the approach with setTimeout.
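A minimal sketch of what that could look like, assuming the shapes used elsewhere in this PR; this is an approximation for illustration, not the exact code that was merged:

// Rough sketch, hypothetical signature; not the merged implementation.
async function internalSchemaToMongoDB(
  fields: { name: string /* ... */ }[],
  signal?: AbortSignal
): Promise<MongoDBJSONSchema> {
  const required: string[] = [];
  const properties: MongoDBJSONSchema['properties'] = {};
  let yieldCounter = 0;
  for (const field of fields) {
    if (++yieldCounter > 100) {
      yieldCounter = 0;
      // yield an event-loop turn so an external abort() has a chance to fire
      await new Promise((resolve) => setTimeout(resolve));
    }
    if (signal?.aborted) throw new Error('Operation aborted');
    // ...convert `field` and populate properties / required...
  }
  return { bsonType: 'object', required, properties };
}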

src/schema-convertors/index.ts
]
}
]
};
Contributor

Not something for this PR, but do we plan to have integration tests at some point, e.g. ones that take documents, generate schemas for them, then add a validator to a real MongoDB database + try to insert those documents?

Contributor Author

Great idea! I was planning something similar in the E2E tests in Compass, but having such integration tests in this package definitely makes sense. I will add a ticket.

@paula-stacho paula-stacho merged commit 530563f into COMPASS-6862-schema-export-multiple-formats Jan 29, 2025
18 checks passed
@paula-stacho paula-stacho deleted the COMPASS-8701-2 branch January 29, 2025 11:45
paula-stacho added a commit that referenced this pull request Feb 6, 2025
…S-8702 COMPASS-8709 (#222)

* feat: add analyzeDocuments + SchemaAccessor COMPASS-8799 (#216)



---------

Co-authored-by: Anna Henningsen <[email protected]>

* feat: internal to MongoDB $jsonSchema conversion COMPASS-8701 (#218)


---------

Co-authored-by: Anna Henningsen <[email protected]>

* feat: internal to Standard JSON Schema conversion COMPASS-8700 (#219)

* feat: internal to Expanded JSON Schema conversion COMPASS-8702  (#220)

---------

Co-authored-by: Anna Henningsen <[email protected]>