1 file changed: 4 additions, 10 deletions

@@ -320,16 +320,10 @@ def untokenize(iterable):
     with at least two elements, a token number and token value. If
     only two tokens are passed, the resulting output is poor.
 
-    Round-trip invariant for full input:
-        Untokenized source will match input source exactly
-
-    Round-trip invariant for limited input:
-        # Output bytes will tokenize back to the input
-        t1 = [tok[:2] for tok in tokenize(f.readline)]
-        newcode = untokenize(t1)
-        readline = BytesIO(newcode).readline
-        t2 = [tok[:2] for tok in tokenize(readline)]
-        assert t1 == t2
+    The result is guaranteed to tokenize back to match the input so
+    that the conversion is lossless and round-trips are assured.
+    The guarantee applies only to the token type and token string as
+    the spacing between tokens (column positions) may change.
 
     """
     ut = Untokenizer()
     out = ut.untokenize(iterable)
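Since the new docstring drops the inline round-trip example, here is a minimal standalone sketch of the property it states, using only the standard library tokenize module; the sample source string and the names source, t1, newcode, and t2 are illustrative, not part of the change.

    from io import BytesIO
    from tokenize import tokenize, untokenize

    # Illustrative input; any syntactically valid source works.
    source = b"x = 1 + 2\nprint(x)\n"

    # Keep only (token type, token string) pairs, discarding positions.
    t1 = [tok[:2] for tok in tokenize(BytesIO(source).readline)]

    # untokenize() rebuilds source bytes; spacing and column positions may
    # differ from the input, but re-tokenizing yields the same (type, string) pairs.
    newcode = untokenize(t1)
    t2 = [tok[:2] for tok in tokenize(BytesIO(newcode).readline)]
    assert t1 == t2

This mirrors the example removed from the docstring while respecting the narrowed guarantee: only token types and strings are preserved, not column positions.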