
Conversation

zewenli98
Collaborator

Description

Refactored layer norm converter with TRT INormalization Layer

Fixes #2730
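
As background, here is a minimal sketch of what mapping a layer norm op onto TensorRT's `INormalizationLayer` through the Python API can look like. This is an illustration only, not the exact converter code from this PR; `network`, `input_tensor`, `weight`, `bias`, `normalized_shape`, and `eps` are placeholder names:

```python
import numpy as np
import tensorrt as trt


def layer_norm_sketch(network, input_tensor, weight, bias, normalized_shape, eps):
    """Sketch of a layer norm conversion using INormalizationLayer.

    `network` is a trt.INetworkDefinition, `input_tensor` a trt.ITensor,
    and `weight`/`bias` are numpy arrays of shape `normalized_shape`.
    """
    rank = len(input_tensor.shape)
    num_norm_dims = len(normalized_shape)

    # Axes bitmask selecting the trailing dimensions to normalize over,
    # mirroring torch.nn.LayerNorm semantics.
    axes = 0
    for d in range(rank - num_norm_dims, rank):
        axes |= 1 << d

    # Scale/bias must be ITensors broadcastable to the input, so pad the
    # weight/bias shapes with leading 1s up to the input rank.
    padded_shape = (1,) * (rank - num_norm_dims) + tuple(normalized_shape)
    gamma = network.add_constant(
        padded_shape, trt.Weights(np.ascontiguousarray(weight))
    ).get_output(0)
    beta = network.add_constant(
        padded_shape, trt.Weights(np.ascontiguousarray(bias))
    ).get_output(0)

    norm_layer = network.add_normalization(input_tensor, gamma, beta, axes)
    norm_layer.epsilon = eps
    return norm_layer.get_output(0)
```

Worth noting for the linked FP16 question: `INormalizationLayer` accumulates in FP32 by default (configurable via its `compute_precision` attribute), which keeps the mean/variance reduction numerically stable even when the rest of the engine runs in FP16.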

Type of change

  • Bug fix (non-breaking change which fixes an issue)

Checklist:

  • My code follows the style guidelines of this project (You can use the linters)
  • I have performed a self-review of my own code
  • I have commented my code, particularly in hard-to-understand areas and hacks
  • I have made corresponding changes to the documentation
  • I have added tests to verify my fix or my feature
  • New and existing unit tests pass locally with my changes
  • I have added the relevant labels to my PR so that relevant reviewers are notified

@zewenli98 zewenli98 added the component: converters label Apr 16, 2024
@zewenli98 zewenli98 self-assigned this Apr 16, 2024
@github-actions github-actions bot added the component: tests, component: conversion, component: api [Python], and component: dynamo labels Apr 16, 2024
Collaborator

@narendasan narendasan left a comment


LGTM, just needs a rebase

@zewenli98 zewenli98 force-pushed the refactor_layer_norm_dynamo_converter branch from 4e47a38 to fbed329 on April 26, 2024 at 22:52
@zewenli98 zewenli98 merged commit 1a4ffe4 into main Apr 26, 2024
@zewenli98 zewenli98 deleted the refactor_layer_norm_dynamo_converter branch April 26, 2024 23:43
Labels
cla signed, component: api [Python], component: conversion, component: converters, component: dynamo, component: tests
Development

Successfully merging this pull request may close these issues.

❓ [Question] Running LayerNorm in fp16