Better reuse of package.json cache, module resolution cache, and package.json auto import filter #47388
Conversation
Some very basic questions, since this isn't really my area:
Yeah, I don't think the cached JSON is going to be a problem before the code itself is a problem, given normal ratios of the two.
Seems reasonable.
(4) is because there’s simply no new information to write back to the cache. In order to have files available for auto-import in the first place, module resolution already had to find them and read their package.jsons. Theoretically we shouldn’t hit one we haven’t already seen. Updating a cache that belongs to module resolution outside of module resolution does feel sketchy, and I would probably avoid it. But in this case there’s nothing to update.
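
Purely as an illustration of that pattern (this is not the actual compiler code; `PackageJsonCache`, `getOrRead`, and `getExisting` are invented names), the idea is that module resolution is the only writer of the package.json cache, and the auto-import side only reads entries that resolution already produced:

```ts
interface PackageJsonLike {
  name?: string;
  exports?: unknown;
}

class PackageJsonCache {
  private entries = new Map<string, PackageJsonLike | undefined>();

  // Module resolution path: read package.json once per directory and cache it
  // (including negative results, so missing files aren't re-checked).
  getOrRead(directory: string, readFile: (path: string) => string | undefined): PackageJsonLike | undefined {
    if (!this.entries.has(directory)) {
      const text = readFile(directory + "/package.json");
      this.entries.set(directory, text !== undefined ? (JSON.parse(text) as PackageJsonLike) : undefined);
    }
    return this.entries.get(directory);
  }

  // Auto-import path: read-only lookup. If a file is available for auto-import,
  // module resolution has already seen its package.json, so there is nothing
  // new to write back from here.
  getExisting(directory: string): PackageJsonLike | undefined {
    return this.entries.get(directory);
  }
}
```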
I made a test project that contained a lot of dependencies with export maps to ensure the new code paths were being hit. The slowdown from main to #47092 can largely be explained by the fact that we’re finding more things to auto-import, as the baseline case completely ignored those export maps. The cache reuse pays off in `collectAutoImports`, where previously we had to read package.jsons that had already been read and cached. Note that `getExportInfoMap` and `collectAutoImports` are components of `completionInfo`, which is the “bottom line” perf number for the server-side work of getting completions.