Commit 9a021b7
Bugfix: LLaMA layer norm incorrectly changes input type and consumes a lot of memory (huggingface#23535)
* Fixed bug where LLaMA layer norm would change input type.
* make fix-copies
---------
Co-authored-by: younesbelkada <[email protected]>
1 parent 72922ce · commit 9a021b7
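The diff tables below only preserved line numbers, so as context here is a minimal sketch of the dtype-preserving RMSNorm pattern the commit message describes: compute the variance in float32 for numerical stability, then cast the result back to the caller's dtype instead of returning float32 activations (which is what changed the input type and inflated memory use). The class and attribute names (`RMSNorm`, `weight`, `variance_epsilon`) follow the usual Transformers layout but are assumptions; this is an illustration of the fix pattern, not the committed code.

```python
import torch
from torch import nn


class RMSNorm(nn.Module):
    """Sketch of an RMSNorm layer that preserves the caller's dtype.

    Mirrors the shape of the Transformers LlamaRMSNorm module; the exact
    lines changed in commit 9a021b7 are not reproduced here.
    """

    def __init__(self, hidden_size: int, eps: float = 1e-6):
        super().__init__()
        self.weight = nn.Parameter(torch.ones(hidden_size))
        self.variance_epsilon = eps

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # Remember the caller's dtype (e.g. float16/bfloat16) so it can be restored.
        input_dtype = hidden_states.dtype
        # Compute the variance in float32 for numerical stability.
        variance = hidden_states.to(torch.float32).pow(2).mean(-1, keepdim=True)
        hidden_states = hidden_states * torch.rsqrt(variance + self.variance_epsilon)
        # Scale, then cast back to the original dtype so downstream consumers
        # are not handed (and do not keep alive) a float32 copy of the activations.
        return (self.weight * hidden_states).to(input_dtype)


if __name__ == "__main__":
    norm = RMSNorm(4096).half()
    x = torch.randn(2, 16, 4096, dtype=torch.float16)
    assert norm(x).dtype == torch.float16  # output dtype matches the input
```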
File tree: 2 files changed, +4 −10 lines
- src/transformers/models
  - llama
  - open_llama
src/transformers/models/llama: +2 −5 lines
[Diff body not captured: new line added at 84; old lines 87–91 replaced by new line 88; surrounding lines 81–86 and 92–94 unchanged]
src/transformers/models/open_llama: +2 −5 lines
[Diff body not captured: new line added at 94; old lines 97–101 replaced by new line 98; surrounding lines 91–93 and 102–104 unchanged]