
Conversation

inukshukdeveloper
Contributor

This PR proposes to add some functionality to checkpoint writing for GPT-2. One of the layers appeared to have the wrong mapping ("weight" vs. "embedding"), and another layer in the model was not being written at all (MultiheadAttention vs. MultiheadAttentionGPT2). The checkpoint file produced with this change differs from the remotely stored checkpoint by a few bytes, so it is possible that this change writes some additional information that the remote version does not contain.
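Roughly, the kind of mapping involved looks like the sketch below; the layer and tensor names are illustrative only, not the actual swift-models checkpoint writer API:

```swift
// Minimal sketch (hypothetical names, not the swift-models API): a checkpoint
// writer keeps a table from in-memory layer names to the tensor names expected
// in the GPT-2 checkpoint. A wrong entry (e.g. "weight" instead of "embedding")
// or a missing entry for a renamed attention layer silently produces a
// mismatched or incomplete checkpoint.
struct CheckpointNameMap {
    // Keys are in-memory layer paths; values are tensor names written to the checkpoint.
    var entries: [String: String] = [
        "embedding": "model/wte",                  // previously mapped under "weight" (assumed)
        "multiheadAttentionGPT2": "model/h0/attn"  // previously skipped entirely (assumed)
    ]

    func checkpointName(forLayer layer: String) -> String? {
        entries[layer]
    }
}

let nameMap = CheckpointNameMap()
print(nameMap.checkpointName(forLayer: "embedding") ?? "missing mapping")
```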

All checkpoint-writing tests passed. Another test could be added to exercise the writing of the auxiliary files if desired.

New checkpoint-writing protocols are in progress (PR #631), so this change is mostly a historical completion of the pattern first established by the GPT-2 model. Hopefully it will be useful to anyone looking back at earlier examples of checkpoint writing.

The work was performed on the 0.10 branch because of some compilation or configuration issues on master. Falling back to 0.10 allowed the work to proceed, but hopefully the change can be merged into master if it is acceptable.

saeta and others added 2 commits July 21, 2020 08:21
…rs were missing in the writing process. Also added copy facility to the FileSystem protocol to copy the auxiliary files to the checkpoint location.
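Roughly, the copy facility amounts to something like the following sketch; the protocol and type names are illustrative, not the actual swift-models FileSystem API:

```swift
import Foundation

// Minimal sketch of the idea: auxiliary files (vocabulary, merges,
// hyperparameters) are copied alongside the tensor data into the checkpoint
// directory via a method on a FileSystem-style protocol.
protocol AuxiliaryFileCopying {
    func copyAuxiliaryFile(from source: URL, to destination: URL) throws
}

struct LocalFileSystem: AuxiliaryFileCopying {
    func copyAuxiliaryFile(from source: URL, to destination: URL) throws {
        // Foundation's FileManager performs the actual copy on a local file system.
        try FileManager.default.copyItem(at: source, to: destination)
    }
}
```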
Contributor

@BradLarson left a comment

Thanks for the fix. The GPT-2 naming changes were incorporated into the more significant rework I was doing in PR #631, but it's good to have these fixed while I'm working on that. Also, copying the metadata into the new checkpoint location is a good idea, thanks for suggesting that.

@BradLarson merged commit 61279b9 into tensorflow:master Aug 19, 2020
@inukshukdeveloper deleted the GPT2_Checkpoint_Additions branch August 19, 2020 20:47
@inukshukdeveloper
Contributor Author

Thanks for the review. Deleted the branch. I'll delete the fork shortly; I believe I still have some outstanding work on it that I need to move to another branch.
