
Commit f4d973c

convert.py : fix llama/llama2 conversion due to vocab_size=-1 (#4258)
1 parent 954e228 commit f4d973c

File tree

1 file changed: +1 −1 lines changed


convert.py

Lines changed: 1 addition & 1 deletion
@@ -267,7 +267,7 @@ def loadOriginalParamsJson(model: LazyModel, config_path: Path) -> Params:
         n_ctx = 2048

     return Params(
-        n_vocab = config.get("vocab_size", model["tok_embeddings.weight"].shape[0]),
+        n_vocab = model["tok_embeddings.weight"].shape[0],
         n_embd  = config["dim"],
         n_layer = config["n_layers"],
         n_ctx   = n_ctx,
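The reason for the change: LLaMA/LLaMA 2 `params.json` files store `"vocab_size": -1`, so `config.get("vocab_size", fallback)` finds the key and returns `-1` instead of falling back to the embedding shape. Reading the vocabulary size directly from `tok_embeddings.weight` sidesteps this. A minimal sketch of the before/after behaviour, where the `config` dict and `FakeTensor` stand-in are illustrative assumptions rather than the real loader objects:

```python
# Illustrative stand-in for the relevant params.json contents
# (LLaMA's params.json really does ship "vocab_size": -1).
config = {"dim": 4096, "n_layers": 32, "vocab_size": -1}

# Stand-in for model["tok_embeddings.weight"]: only .shape matters here.
class FakeTensor:
    shape = (32000, 4096)

model = {"tok_embeddings.weight": FakeTensor()}

# Pre-fix: dict.get() only uses the default when the key is ABSENT,
# so the bogus -1 leaks through as the vocab size.
n_vocab_old = config.get("vocab_size", model["tok_embeddings.weight"].shape[0])
assert n_vocab_old == -1

# Post-fix: take the vocab size from the embedding matrix itself.
n_vocab_new = model["tok_embeddings.weight"].shape[0]
assert n_vocab_new == 32000
```

The design point is that `dict.get`'s default guards against a *missing* key, not an *invalid* value; when the file can contain a sentinel like `-1`, the tensor shape is the more reliable source of truth.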

0 commit comments
