Cleanup Enums #1201


Merged
merged 5 commits, Jan 18, 2018

Conversation

IvanGoncharov (Member)

I converted Kind into an object and created KindEnum and TokenKindEnum types.
To do this I wrapped the enums in Object.freeze, as required by Flow.
It's safe to use Object.freeze since it is supported by all browsers that this lib supports:
[screenshot: browser support table for Object.freeze]

If for some reason Object.freeze can't be used in this lib, I can wrap it in a code comment as suggested here: facebook/flow#5465 (comment)
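A minimal sketch of the frozen-enum pattern described above; the kind values shown are only a small subset of graphql-js's real Kind names, used here for illustration:

```javascript
// Sketch of the enum cleanup: a plain object frozen so Flow can treat its
// values as a literal union (e.g. type KindEnum = $Values<typeof Kind>).
// The member names below are illustrative, not the full graphql-js list.
const Kind = Object.freeze({
  NAME: 'Name',
  DOCUMENT: 'Document',
  OPERATION_DEFINITION: 'OperationDefinition',
});

try {
  Kind.NAME = 'Mutated'; // throws in strict mode, silently ignored otherwise
} catch (e) {}
console.log(Kind.NAME); // still 'Name'
```

Freezing the object makes accidental mutation impossible at runtime, while the Flow type makes invalid kind strings a compile-time error.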

@leebyron (Contributor)

A couple thoughts:

First, do you have any performance metrics for this change? This replaces a local const with a dictionary lookup for kind in the parser and lexer and replaces a const int with a string length calculation in the lexer. I think it's important we ensure this isn't a parse time regression.

Second, could you explain the impact of this change on the use of the library? Aside from some extra verbosity cost of replacing XYZ with Kind.XYZ everywhere, this seems reasonable - though I'm not sure I understand if this has non-internal implications.

@leebyron (Contributor)

I've also heard reports that Object.freeze() can result in slower object reads - do you know about this and if those reports apply to this use case?

@IvanGoncharov (Member, Author) commented Jan 13, 2018

Second, could you explain the impact of this change on the use of the library? Aside from some extra verbosity cost of replacing XYZ with Kind.XYZ everywhere, this seems reasonable - though I'm not sure I understand if this has non-internal implications.

I look at different parts of graphql-js almost every day for two reasons:

  • to understand certain aspects of GraphQL
  • to see how to reuse code from graphql-js so I don't need to reimplement it in my tools

And when I see an inconsistency, I can't tell whether it is intentional or exists for historical reasons. In this case, I noticed the recently added directiveLocation.js and was wondering why the other enums are defined differently.

You are absolutely right that this PR doesn't change anything for library consumers, but I think such changes improve the quality of the "reference implementation".

I've also heard reports that Object.freeze() can result in slower object reads - do you know about this and if those reports apply to this use case?

The only thing I could find is reports that applying Object.freeze to the prototype of built-in types (Array, Object, ...) prevented Chrome from using the built-in implementations of their methods.
That doesn't affect this PR and is already fixed; here is a great article explaining the details:
http://benediktmeurer.de/2017/08/10/frozen-prototypes/
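To make the distinction concrete, here is an illustrative micro-benchmark (an assumption, not taken from the PR) comparing property reads on a frozen versus an unfrozen plain object - the only kind of object this PR freezes, as opposed to built-in prototypes:

```javascript
// Micro-benchmark sketch: reading a property from a plain vs. a frozen
// object. On modern engines the difference for plain objects like the
// Kind enum is negligible; numbers here are only indicative.
const plain = { NAME: 'Name' };
const frozen = Object.freeze({ NAME: 'Name' });

// Sums the length of obj.NAME n times, forcing n property reads.
function readMany(obj, n) {
  let total = 0;
  for (let i = 0; i < n; i++) total += obj.NAME.length;
  return total;
}

const t0 = Date.now();
readMany(plain, 1e6);
const t1 = Date.now();
readMany(frozen, 1e6);
const t2 = Date.now();
console.log(`plain: ${t1 - t0} ms, frozen: ${t2 - t1} ms`);
```

Timings from a loop this small are noisy; the point is only that freezing a plain data object does not change the shape of the reads the way freezing a built-in prototype once did in Chrome.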

First, do you have any performance metrics for this change?

Not initially 😞
But I have already added a lexer benchmark in #1167.

This replaces a local const with a dictionary lookup for kind in the parser and lexer

I profiled it both with #1167 and the node profiler and didn't notice any changes.

const int with a string length calculation in the lexer

I haven't measured any difference between using a const int and a string length.
However, the overall change did cause a performance drop of ~4%, but only because I defined literalTok inside the function.

But when I moved literalTok to the global scope (e9098c2), it actually resulted in a ~7% performance increase over the current master, and I was also able to measure a few percent increase in the parser benchmark.
It surprised me a lot, so I ran the profiler (master on the left, literalTok with no other changes on the right):
[screenshot: profiler comparison of master vs. the literalTok change]

So it looks like V8 inlined both the built-in for `new` and the Tok constructor inside literalTok.
I think this happened because V8 was able to determine the types in literalTok with 100% certainty:

[marking  <JSFunction literalTok (sfi = )> for optimized recompilation, reason: small function, ICs with typeinfo: 3/3 (100%), generic ICs: 0/3 (0%)]
[marking  <JSFunction readToken (sfi = )> for optimized recompilation, reason: hot and stable, ICs with typeinfo: 126/140 (90%), generic ICs: 0/140 (0%)]

So my speculation is that `ICs with typeinfo: 3/3 (100%)` is responsible for this boost.
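A hypothetical reconstruction of the change being discussed; the exact shapes of `Tok` and `literalTok` in graphql-js may differ, but the hoisting idea can be sketched as:

```javascript
// Sketch of hoisting a token-creating helper out of the per-call scope.
// A tiny constructor with a fixed set of fields, so every Tok has one shape.
function Tok(kind, start, end) {
  this.kind = kind;
  this.start = start;
  this.end = end;
}

// Defined once at module scope rather than inside the lexer function, so V8
// sees a single small call site whose type feedback stays fully monomorphic
// ("ICs with typeinfo: 3/3 (100%)") and can inline both it and `new Tok`.
function literalTok(kind, start, end) {
  return new Tok(kind, start, end);
}

const tok = literalTok('BANG', 0, 1);
console.log(tok.kind, tok.start, tok.end); // BANG 0 1
```

Defining the helper inside the lexer function would instead create a fresh closure on every call, which is what caused the ~4% drop before the hoist.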

Enum values are best left as representative opaque values, so we should avoid `value.length`. Minor variable name changes to preserve existing formatting.
@leebyron (Contributor)

Thanks for the analysis, @IvanGoncharov - this looks great.

@leebyron leebyron merged commit 17a0bfd into graphql:master Jan 18, 2018
@IvanGoncharov (Member, Author)

Enum values are best left as representative opaque values, so we should avoid value.length.

Agree 👍
Thanks for merging.

@IvanGoncharov IvanGoncharov deleted the parser-cleanup branch January 23, 2018 11:08