
proposal: expand the go1 bench suite #20384

Closed
josharian opened this issue May 16, 2017 · 7 comments

@josharian
Contributor

[moved from #16192]

It's great to have a standard compiler bench suite. But the go1 suite is deep (fmt) and not particularly broad. Though it is slow and inconvenient to run lots of benchmarks, it is also very useful to have a large variety of code to validate compiler changes that have non-obvious trade-offs, like inlining and code layout.

I propose that we expand the go1 bench suite. I propose that we do it by pulling from real world code that people have filed performance bugs about, preferring medium-sized benchmarks. (That is, not so slow they only execute once in a second, but doing enough non-trivial work that they should be decently robust.) We would get IP permission and do just enough cleanup to make them fit the go1 mold. Two potential examples include #16192 and #16122.
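For concreteness, here is a minimal sketch (not from any actual report) of the shape such a benchmark might take after cleanup to fit the go1 mold: a single test file driven by a standard testing.B loop. The package, workload, and names below are placeholders.

```go
// go1-style benchmark sketch: real-world code vendored into one test file
// and driven by a standard testing.B loop. The workload is a placeholder
// standing in for the (IP-cleared) extracted code.
package go1

import (
	"bytes"
	"testing"
)

// buildReport is a stand-in for code pulled from a real-world performance
// report: non-trivial per-iteration work, but well under a second per call.
func buildReport() []byte {
	var buf bytes.Buffer
	for i := 0; i < 1000; i++ {
		buf.WriteString("line of report output\n")
	}
	return buf.Bytes()
}

var sink int // keeps the result live so the work isn't optimized away

func BenchmarkRealWorldReport(b *testing.B) {
	for i := 0; i < b.N; i++ {
		sink = len(buildReport())
	}
}
```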

We could do it ad hoc (just send a CL when we see a good candidate) or accumulate a list and occasionally evaluate it. If the latter, the list could be accumulated here or with GitHub labels. I don't feel strongly about either of those decisions; input welcome.

cc @davecheney @navytux

gopherbot added this to the Proposal milestone May 16, 2017
@quentinmit
Contributor

I suggest x/benchmarks as a better home for these than the go1 benchmarks.

@josharian
Contributor Author

Seems reasonable. Maybe x/benchmarks/compiler? Although if we have that, why do we also need the go1 benchmarks? Should we move them over to seed it?

@quentinmit
Contributor

No. For benchmarks to be useful they need to not change over time. I think we should leave the go1 benchmarks frozen as they are and put new stuff in x/benchmarks.

@josharian
Contributor Author

Then we will have two competing sets of benchmarks and two separate places I have to go to benchmark my compiler changes. There should be only one.

(In fact, we already have that problem: x/benchmarks has a build benchmark measuring toolchain performance, but we also have x/tools/cmd/compilebench.)

@joetaber

> For benchmarks to be useful they need to not change over time.

But "not change over time" is orthogonal to where they are stored. Move them or add to them, as long as they're not altered and the tooling only tries to compare benchmarks that were available both before and after it shouldn't be an issue.

That makes me think of something else. Let's say someone designs a new, very good benchmark (whatever that means). It might be valuable to run it against previous versions of Go to compare its performance historically. That becomes more difficult if the benchmarks are stored in the same repo that is being checked out and built. For this reason it might be valuable to keep benchmarks in a separate repo.

@rsc
Contributor

rsc commented Jun 12, 2017

If these are third-party-authored, please put them outside the main repo (x/benchmarks is fine). Also please hook into the existing x/benchmarks framework instead of making new standalone ones, so that all the standard data that x/benchmarks benchmarks report comes for free. But x/benchmarks is the place, probably.

go1 is fine for new, smaller tests (roughly the same size as the ones already there) that are important and locally authored.

@rsc
Contributor

rsc commented Jun 12, 2017

Closing since that should make clear where things go (for Josh's case, x/benchmarks).

rsc closed this as completed Jun 12, 2017
golang locked and limited conversation to collaborators Jun 12, 2018