x/tools/gopls: duplicated analysis work #61508
Labels: FrozenDueToAge, gopls/performance, gopls, Tools
The following log events were observed in a single run of "gopls check" (with staticcheck and nilness enabled) that was instrumented to report the start and end of each call to Analyze:
Observe that two concurrent Analyze calls request that the same package be analyzed with two different sets of analyzers.
The different sets arise from the two calls to source.Analyze, from diagnosePkg and codeAction, which pass different values for the "includeConvenience" boolean, varying the set. However, the facty (fact-exporting) subset of both sets is the same, so in principle the two requests should be able to share their work.
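A minimal sketch of that idea, with hypothetical names (factyKey and the analyzer literals are illustrations, not the actual gopls code): keying a unit of analysis by only its fact-exporting analyzers would make the two differing requests map to the same shared work.

```go
// Hypothetical sketch: derive a cache key from the facty subset of a
// requested analyzer set, so that requests differing only in convenience
// analyzers (includeConvenience true vs. false) share one analysis unit.
package main

import (
	"fmt"
	"sort"
	"strings"

	"golang.org/x/tools/go/analysis"
)

// factyKey returns a stable key covering only the fact-producing analyzers.
// Convenience analyzers export no facts, so they do not affect what
// dependent packages need from this package's analysis.
func factyKey(analyzers []*analysis.Analyzer) string {
	var names []string
	for _, a := range analyzers {
		if len(a.FactTypes) > 0 {
			names = append(names, a.Name)
		}
	}
	sort.Strings(names)
	return strings.Join(names, ",")
}

// myFact is a dummy fact type for the example.
type myFact struct{}

func (*myFact) AFact() {}

func main() {
	facty := &analysis.Analyzer{Name: "facty", FactTypes: []analysis.Fact{new(myFact)}}
	convenience := &analysis.Analyzer{Name: "convenience"} // exports no facts
	// Both requests produce the same key despite differing sets.
	fmt.Println(factyKey([]*analysis.Analyzer{facty}))              // "facty"
	fmt.Println(factyKey([]*analysis.Analyzer{facty, convenience})) // "facty"
}
```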
Even if the flag were always true, the concurrent calls each create a DAG of analysisNodes for a batch of work and execute it in parallel. Both encounter the slow ec2 package (1s, see #61506), so both incur cache misses, even though they want the same computation. The only sharing is of completed analysis summaries via the file cache; there is no de-duplication of in-flight requests (i.e. the first thread doesn't "lick the cookie"). This is problematic for large packages such as ec2: they are expensive to recompute, and they are more likely to be recomputed because their longer in-flight duration increases the odds of a concurrent request.
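A minimal sketch of the failure mode, using a simplified in-memory stand-in for the file cache (hypothetical names, not the actual gopls cache): a cache that stores only completed results cannot stop two concurrent requests from both missing and both recomputing.

```go
// Sketch: a cache of completed results offers no in-flight de-duplication.
// Two goroutines requesting the same key before either finishes both miss
// and both run the expensive computation.
package main

import (
	"fmt"
	"sync"
	"sync/atomic"
	"time"
)

type resultCache struct {
	mu sync.Mutex
	m  map[string][]byte
}

// get returns the cached value for key, computing it on a miss. Nothing
// records that another goroutine is already computing the same key.
func (c *resultCache) get(key string, compute func() []byte) []byte {
	c.mu.Lock()
	if v, ok := c.m[key]; ok {
		c.mu.Unlock()
		return v // hit: a completed summary was already stored
	}
	c.mu.Unlock()
	v := compute() // miss: duplicated work if another computation is in flight
	c.mu.Lock()
	c.m[key] = v
	c.mu.Unlock()
	return v
}

func main() {
	cache := &resultCache{m: make(map[string][]byte)}
	var computations atomic.Int32
	var wg sync.WaitGroup
	for i := 0; i < 2; i++ { // e.g. diagnosePkg and codeAction racing
		wg.Add(1)
		go func() {
			defer wg.Done()
			cache.get("ec2", func() []byte {
				computations.Add(1)
				time.Sleep(time.Second) // stand-in for the ~1s analysis of ec2
				return []byte("summary")
			})
		}()
	}
	wg.Wait()
	fmt.Println("computations:", computations.Load()) // prints 2, not 1
}
```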
One solution would be to dedup using singleflight or a promise cache, though the logic could be tricky. Another would be to somehow stagger the requests (codeAction and diagnosePkg) during initial indexing, making concurrent requests for large packages less likely; once the cache is populated, subsequent operation is fast.
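A minimal sketch of the first approach using golang.org/x/sync/singleflight (the analyze helper and keying scheme are assumptions for illustration, not the actual gopls code): the first caller for a key runs the computation, and concurrent callers for the same key block and share its result.

```go
// Sketch: de-duplicating concurrent analysis requests with singleflight.
package main

import (
	"fmt"
	"sync"

	"golang.org/x/sync/singleflight"
)

var group singleflight.Group

// analyze stands in for the expensive per-package analysis; in practice the
// key would identify the package and its (facty) analyzer set.
func analyze(key string) ([]byte, error) {
	v, err, shared := group.Do(key, func() (interface{}, error) {
		// ... run the analysis DAG and write the summary to the file cache ...
		return []byte("summary for " + key), nil
	})
	_ = shared // true when the result was shared with another in-flight caller
	if err != nil {
		return nil, err
	}
	return v.([]byte), nil
}

func main() {
	var wg sync.WaitGroup
	for i := 0; i < 2; i++ { // e.g. diagnosePkg and codeAction racing
		wg.Add(1)
		go func() {
			defer wg.Done()
			s, _ := analyze("ec2")
			fmt.Println(string(s)) // the computation runs only once
		}()
	}
	wg.Wait()
}
```

One source of the trickiness mentioned above: with a naive singleflight, cancellation of the first caller's context would fail every waiter sharing that call, so a real implementation would need to detach the computation from any single request's lifetime.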
It's worth noting that this causes at worst a 2x slowdown, which is a lot, but still much less than the current gap between the actual and ideal performance of buildssa (see #61506).