Fix TLV serialization to work with large types #964

Merged
1 commit merged into lightningdevkit:main on Jun 24, 2021

Conversation

@valentinewallace (Contributor) commented Jun 22, 2021

Prior to this PR, TLV serialization involved iterating from 0 to the highest
given TLV type. This worked until we decided to implement keysend, which has a
TLV type of ~5.48 billion.

So instead, we now specify the kind of value being (de)serialized for each field,
which can be an Option, a Vec, or "something else" (specified in the
serialization macros as "required").
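
To make the change concrete, here is a minimal standalone sketch (this is not the crate's actual macro code; the BigSize helper and the record-list API below are simplified stand-ins): instead of scanning every possible type from 0 up to the highest one used, the serializer only walks the records that are actually present, in ascending type order.

```rust
use std::io::{self, Write};

// BOLT-style BigSize varint: 1, 3, 5, or 9 bytes depending on magnitude.
fn write_bigsize<W: Write>(w: &mut W, n: u64) -> io::Result<()> {
    match n {
        0..=0xfc => w.write_all(&[n as u8]),
        0xfd..=0xffff => {
            w.write_all(&[0xfd])?;
            w.write_all(&(n as u16).to_be_bytes())
        }
        0x1_0000..=0xffff_ffff => {
            w.write_all(&[0xfe])?;
            w.write_all(&(n as u32).to_be_bytes())
        }
        _ => {
            w.write_all(&[0xff])?;
            w.write_all(&n.to_be_bytes())
        }
    }
}

// Write only the (type, value) records we actually have, in ascending type
// order, rather than looping from 0 to the highest type looking for matches.
fn write_tlv_records<W: Write>(w: &mut W, records: &[(u64, &[u8])]) -> io::Result<()> {
    for &(ty, value) in records {
        write_bigsize(w, ty)?;                 // type
        write_bigsize(w, value.len() as u64)?; // length
        w.write_all(value)?;                   // value
    }
    Ok(())
}

fn main() -> io::Result<()> {
    // Keysend's TLV type is ~5.48 billion; iterating 0..=type per struct is
    // clearly infeasible, while the explicit record list stays O(#fields).
    let keysend_type: u64 = 5_482_373_484;
    let preimage = [0u8; 32];
    let records: [(u64, &[u8]); 2] = [(2, &[1, 2, 3]), (keysend_type, &preimage)];
    let mut buf = Vec::new();
    write_tlv_records(&mut buf, &records)?;
    println!("serialized {} bytes", buf.len());
    Ok(())
}
```

In the real macros the presence of each field is driven by whether it is an Option, a Vec, or a required value, but the key point is the same: serialization cost now depends on the number of fields, not on the numeric value of the largest TLV type.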

@codecov (bot) commented Jun 22, 2021

Codecov Report

Merging #964 (40959b7) into main (bfd9646) will decrease coverage by 0.00%.
The diff coverage is 93.97%.


@@            Coverage Diff             @@
##             main     #964      +/-   ##
==========================================
- Coverage   90.65%   90.65%   -0.01%     
==========================================
  Files          60       60              
  Lines       30407    30398       -9     
==========================================
- Hits        27566    27557       -9     
  Misses       2841     2841              
Impacted Files | Coverage Δ
lightning/src/routing/router.rs | 95.95% <0.00%> (ø)
lightning/src/util/events.rs | 17.29% <33.33%> (ø)
lightning/src/util/ser_macros.rs | 95.91% <96.15%> (-0.83%) ⬇️
lightning/src/chain/channelmonitor.rs | 90.76% <100.00%> (-0.01%) ⬇️
lightning/src/chain/keysinterface.rs | 95.41% <100.00%> (ø)
lightning/src/chain/onchaintx.rs | 94.14% <100.00%> (ø)
lightning/src/chain/package.rs | 92.27% <100.00%> (-0.02%) ⬇️
lightning/src/ln/chan_utils.rs | 97.42% <100.00%> (-0.01%) ⬇️
lightning/src/ln/channel.rs | 88.29% <100.00%> (ø)
lightning/src/ln/channelmanager.rs | 83.85% <100.00%> (ø)
... and 3 more

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update bfd9646...40959b7.

@jkczyz (Contributor) left a comment

Nicely done! Wanted to drop a few comments before finding more time for a more thorough review.

@TheBlueMatt (Collaborator) left a comment

Code basically looks good; I don't have an opinion on macro naming or on overhauling the entire syntax (now that that's practical, given we no longer have strange cross-item ordering constraints).

@TheBlueMatt (Collaborator) left a comment

Only nits.

BigSize(field.serialized_length() as u64).write($stream)?;
field.write($stream)?;
}
encode_tlv!($stream, $type, $field, $fieldty);
Contributor

May be worth pulling this out into a separate repetition and making this repetition #[cfg(debug_assertions)]-enabled in its own scope. Release builds might be better optimized since last_seen wouldn't exist. I think the allow wouldn't be needed either.

Contributor Author

Is this the implementation you were thinking of? https://i.imgur.com/G3g1BNz.png Seems it'll still need the allow

Collaborator

Eh, if there's one thing I have confidence in LLVM for, it's optimizing away purely-on-stack variables which are written to and never read (and branches with no effect).

Contributor

I was thinking last_seen would go inside the second #[cfg(debug_assertions)] scope, leaving only one #[cfg(debug_assertions)]. But maybe it's still problematic?

Contributor Author

> I was thinking last_seen would go inside the second #[cfg(debug_assertions)] scope, leaving only one #[cfg(debug_assertions)]. But maybe it's still problematic?

Ah, for some reason that was less obviously correct to me, but it does seem to be right according to -Z trace-macros. Still needs the allow but otherwise done
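
For readers following this thread, here is a rough loop-based sketch of the pattern being discussed (the real code is a macro repetition in ser_macros.rs; the name `last_seen` and the exact ordering check are inferred from the conversation rather than copied, and the `allow` mentioned above comes from how the macro expands, which this simplified version does not reproduce): both the declaration and every use of the debug-only variable sit behind #[cfg(debug_assertions)], so release builds never declare it at all.

```rust
use std::io::{self, Write};

// Illustrative only: a loop-based analogue of the macro repetition discussed
// above. `last_seen` exists solely in debug builds and asserts that TLV types
// are written in strictly increasing order.
fn write_types<W: Write>(w: &mut W, types: &[u64]) -> io::Result<()> {
    #[cfg(debug_assertions)]
    let mut last_seen: Option<u64> = None;

    for &ty in types {
        // Debug-only monotonicity check; compiled out entirely in release builds.
        #[cfg(debug_assertions)]
        {
            if let Some(prev) = last_seen {
                debug_assert!(ty > prev, "TLV types must be written in increasing order");
            }
            last_seen = Some(ty);
        }
        // The actual record serialization would go here; we just write the raw type.
        w.write_all(&ty.to_be_bytes())?;
    }
    Ok(())
}
```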

@TheBlueMatt (Collaborator) left a comment

ACK modulo two suggested TLV fixups. Just to check, @jkczyz you were fine with leaving syntax refactors to a followup (or not doing them), correct?

@jkczyz (Contributor) commented Jun 24, 2021

> ACK modulo two suggested TLV fixups. Just to check, @jkczyz you were fine with leaving syntax refactors to a followup (or not doing them), correct?

Yup

Prior to this PR, TLV serialization involved iterating from 0 to the highest
given TLV type. This worked until we decided to implement keysend, which has a
TLV type of ~5.48 billion.

So instead, we now specify the kind of value being (de)serialized for each field,
which can be an Option, a Vec, or a non-Option value (specified in the
serialization macros as "required").
@TheBlueMatt TheBlueMatt merged commit b9aa71f into lightningdevkit:main Jun 24, 2021