
Update the docstring of untokenize() to match the docs #128519


Closed
tomasr8 opened this issue Jan 5, 2025 · 3 comments

tomasr8 (Member) commented Jan 5, 2025

Documentation

In #128031, it was deemed too risky to change untokenize() to more closely match the original input string. We should update the docstring of untokenize() to reflect that and to prevent confusion in the (rare) cases where its output does not match the input.

The docs already make this clear, so we can simply reuse that explanation:

The reconstructed script is returned as a single string. The result is
guaranteed to tokenize back to match the input so that the conversion is
lossless and round-trips are assured. The guarantee applies only to the
token type and token string as the spacing between tokens (column
positions) may change.
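
For illustration (not part of the issue text), here is a minimal sketch of what that guarantee means in practice, assuming a recent CPython 3.x: passing only (type, string) pairs to untokenize() can change the spacing, but re-tokenizing the result yields the same token types and strings.

```python
# Minimal sketch of the round-trip guarantee described above: the rebuilt
# source may differ in spacing, but it tokenizes back to the same
# (token type, token string) pairs.
import io
import tokenize

source = "x = ( 1   +2 )\n"
tokens = list(tokenize.generate_tokens(io.StringIO(source).readline))

# Untokenize from (type, string) pairs only, so column positions are not kept.
rebuilt = tokenize.untokenize((tok.type, tok.string) for tok in tokens)
print(repr(rebuilt))  # spacing differs from the original source

# Re-tokenizing the rebuilt source yields matching token types and strings.
retokenized = list(tokenize.generate_tokens(io.StringIO(rebuilt).readline))
assert [(t.type, t.string) for t in tokens] == [(t.type, t.string) for t in retokenized]
```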

Linked PRs

tomasr8 added the docs, stdlib, and topic-parser labels on Jan 5, 2025
tomasr8 self-assigned this on Jan 5, 2025
erlend-aasland (Contributor) commented

BTW, it would have been perfectly fine to use #128031 for #128521, instead of creating a new issue.

erlend-aasland pushed a commit that referenced this issue Jan 6, 2025
tomasr8 (Member, Author) commented Jan 6, 2025

BTW, it would have been perfectly fine to use #128031 for #128521, instead of creating a new issue.

I thought that since that issue was closed, it was better to make a new one. Sorry for the noise!
