
[Bug]: Error generating from LLM: temperature (=0.0) has to be a strictly positive float, otherwise your next token scores will be invalid. If you're looking for greedy decoding strategies, set do_sample=False. #959

@komorebi6666

Description


I use the model from Hugging Face; the model settings in my config.json are as follows:

```json
"model": {
    "provider": "huggingface_pipeline",
    "name": "refuelai/Llama-3-Refueled"
}
```

I executed the following commands:

```python
from autolabel import LabelingAgent, AutolabelDataset

agent = LabelingAgent(config='config.json')
ds = AutolabelDataset('movie_reviews.csv', config=agent.config)
agent.plan(ds)
ds = agent.run(ds)
ds.save('movie_reviews_labeled.csv')
```

The following problem occurred:

```
Setting pad_token_id to eos_token_id:128001 for open-end generation.
Setting pad_token_id to eos_token_id:128001 for open-end generation.
Error generating from LLM: temperature (=0.0) has to be a strictly positive float, otherwise your next token scores will be invalid. If you're looking for greedy decoding strategies, set do_sample=False.
━━━━━╸━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1/7 0:00:00 -:--:--
Cost in $=0.00
Setting pad_token_id to eos_token_id:128001 for open-end generation.
```

How do I set the corresponding parameters in config.json to resolve this issue?
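A minimal sketch of a possible fix, assuming autolabel accepts a `params` object inside the `model` block and forwards it to the underlying Hugging Face pipeline (the `params` key name is an assumption, not confirmed from this issue). Following the error message's advice, disabling sampling switches generation to greedy decoding, which sidesteps the zero-temperature check:

```json
"model": {
    "provider": "huggingface_pipeline",
    "name": "refuelai/Llama-3-Refueled",
    "params": {
        "do_sample": false
    }
}
```

Alternatively, if sampling is desired, setting `"temperature"` to a strictly positive value (e.g. `0.1`) in the same `params` block should also satisfy the check, since transformers rejects `temperature=0.0` only when `do_sample` is enabled.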
