Description
Summary
The `Metrics` class in `packages/metrics/src/Metrics.ts` uses `Object.assign(acc, dims)` in a reducer pattern when merging dimension sets in the `serializeMetrics()` method (line 768). This creates O(n²) time complexity: already-merged properties are re-copied on each iteration, so the total copying grows quadratically as the number of dimension sets increases.
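The cost difference can be sketched with a standalone example (the `Dimensions` alias and sample data below are hypothetical, not the library's actual code). The accumulating-spread form that the biome rule targets copies every already-merged key on each iteration, while in-place mutation writes each key exactly once:

```typescript
type Dimensions = Record<string, string>;

// Five single-key dimension sets: { d0: 'v0' }, { d1: 'v1' }, ...
const sets: Dimensions[] = Array.from({ length: 5 }, (_, i) => ({ [`d${i}`]: `v${i}` }));

// Accumulating spread: iteration k copies all k keys already in `acc`,
// so total work is 1 + 2 + ... + n = O(n²) key copies.
const quadratic = sets.reduce<Dimensions>((acc, dims) => ({ ...acc, ...dims }), {});

// In-place mutation: each key is written exactly once, O(n) total.
const linear = sets.reduce<Dimensions>((acc, dims) => {
  for (const [key, value] of Object.entries(dims)) {
    acc[key] = value;
  }
  return acc;
}, {});
```

Both produce the same merged object; only the amount of copying differs.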
Why is this needed?
This performance issue can cause slowdowns when users add multiple dimension sets to their metrics, especially in high-throughput Lambda functions. The current implementation violates the coding guidelines, which recommend avoiding `Object.assign` in accumulators in favor of `for...of` loops. Additionally, there is already a biome lint suppression (`// biome-ignore lint/performance/noAccumulatingSpread`) that was added during the last refactor.
Which area does this relate to?
Metrics
Solution
Replace the `Object.assign(acc, dims)` call with a `for...of` loop that mutates the accumulator directly:
```ts
// Current (O(n²)):
...this.dimensionSets.reduce((acc, dims) => Object.assign(acc, dims), {})

// Proposed (O(n)):
...this.dimensionSets.reduce((acc, dims) => {
  for (const [key, value] of Object.entries(dims)) {
    acc[key] = value;
  }
  return acc;
}, {} as Dimensions)
```
This maintains identical behavior while improving performance and following the established coding guidelines.
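To check that behavior really is identical, a small sketch (sample dimension sets invented for illustration, not taken from the codebase) confirms the `for...of` merge keeps the same last-write-wins semantics for overlapping keys that `Object.assign` has:

```typescript
type Dimensions = Record<string, string>;

// Two dimension sets with an overlapping key: the later set should win.
const dimensionSets: Dimensions[] = [
  { service: 'orders', region: 'us-east-1' },
  { region: 'eu-west-1' },
];

const merged = dimensionSets.reduce<Dimensions>((acc, dims) => {
  for (const [key, value] of Object.entries(dims)) {
    acc[key] = value; // later writes overwrite earlier ones, like Object.assign
  }
  return acc;
}, {});

console.log(merged); // { service: 'orders', region: 'eu-west-1' }
```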
Acknowledgment
- This request meets Powertools for AWS Lambda (TypeScript) Tenets
- Should this be considered in other Powertools for AWS Lambda languages? i.e. Python, Java, and .NET
Future readers: Please react with 👍 and your use case to help us understand customer demand.