Conversation

@rakkit commented Oct 24, 2025

As titled: the example here uses some outdated TorchTitan APIs. This PR fixes them and aligns the example with other torchtitan end-to-end usages.

# usage for testing fp8_rowwise (the default scaling type)
python ./torchao/prototype/moe_training/examples/simple_moe_layer.py

# the same, with the scaling type passed explicitly
python ./torchao/prototype/moe_training/examples/simple_moe_layer.py --scaling_type fp8_rowwise

# usage for testing mxfp8
python ./torchao/prototype/moe_training/examples/simple_moe_layer.py --scaling_type mxfp8

Results: (screenshot attached)

@pytorch-bot bot commented Oct 24, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/ao/3242

Note: Links to docs will display an error until the docs builds have been completed.

❌ 1 New Failure

As of commit bf18374 with merge base 03c2d28:

NEW FAILURE - The following job has failed:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

meta-cla bot added the "CLA Signed" label on Oct 24, 2025
rakkit force-pushed the fix_moe_training_examples branch from e4eb126 to a7e623b on October 24, 2025 17:07
vkuzo requested a review from danielvegamyhre on October 27, 2025 13:19
@danielvegamyhre (Contributor) left a comment


Thanks for updating this! Did you confirm it is successfully runnable via copy/paste?

rakkit force-pushed the fix_moe_training_examples branch from d08f3b3 to bc2517e on October 27, 2025 18:21
@rakkit (Author) commented Oct 27, 2025

I made a few changes: set the random seed, and allow passing scaling_type from the CLI.
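The two changes described above (a fixed seed plus a `--scaling_type` CLI flag) can be sketched roughly as follows. This is a minimal illustration with hypothetical structure, not the actual contents of `simple_moe_layer.py`; only the flag name `--scaling_type` and its two values (`fp8_rowwise`, `mxfp8`) come from the PR.

```python
import argparse


def parse_args(argv=None):
    # Sketch of the CLI described in the PR: fp8_rowwise is the default,
    # and mxfp8 can be selected explicitly.
    parser = argparse.ArgumentParser(
        description="MoE low-precision training example (sketch)"
    )
    parser.add_argument(
        "--scaling_type",
        default="fp8_rowwise",
        choices=["fp8_rowwise", "mxfp8"],
        help="scaling recipe to test (default: fp8_rowwise)",
    )
    return parser.parse_args(argv)


def set_seed(seed=42):
    # Fixing the seed makes the per-step losses reproducible across runs,
    # so the printed output can be compared against the expected values.
    import torch  # imported lazily; only needed when actually training

    torch.manual_seed(seed)


if __name__ == "__main__":
    args = parse_args()
    set_seed()
    print(f"running with scaling_type={args.scaling_type}")
```

Run without flags to get the `fp8_rowwise` default, or with `--scaling_type mxfp8` to switch recipes, matching the usage commands in the PR description.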

@rakkit (Author) commented Oct 27, 2025

@danielvegamyhre yes, the output should be:

> python ./torchao/prototype/moe_training/examples/simple_moe_layer.py 
step 0 loss: 2656.0
step 1 loss: 2624.0
step 2 loss: 2592.0
step 3 loss: 2560.0
step 4 loss: 2528.0
step 5 loss: 2512.0
step 6 loss: 2480.0
step 7 loss: 2448.0
step 8 loss: 2432.0
step 9 loss: 2416.0

I have already revised the PR message and put the screenshot there.

rakkit force-pushed the fix_moe_training_examples branch from bc2517e to bf18374 on October 27, 2025 22:35