Hey! The example in the README isn't working. I copied the code into a new console project and got an "Object reference not set to an instance of an object" error (a NullReferenceException). After debugging, I tracked it down to:
LLamaSharp/LLama/LLamaInteractExecutor.cs, line 309 (commit 624c870):

    var id = inferenceParams.SamplingPipeline.Sample(Context.NativeHandle, batch.TokenCount - 1);
It seems that inferenceParams.SamplingPipeline is null.
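For context, a minimal console repro along the lines of the README example looks roughly like this; the model path is a placeholder and the exact parameter names may differ slightly between LLamaSharp versions:

using System;
using System.Collections.Generic;
using LLama;
using LLama.Common;

// Placeholder model path; any GGUF model should reproduce the error.
var parameters = new ModelParams("model.gguf") { ContextSize = 1024 };
using var model = LLamaWeights.LoadFromFile(parameters);
using var context = model.CreateContext(parameters);
var executor = new InteractiveExecutor(context);

var inferenceParams = new InferenceParams
{
    MaxTokens = 256,
    AntiPrompts = new List<string> { "User:" },
    // Note: no SamplingPipeline is set here, which is what leads to the null dereference.
};

await foreach (var text in executor.InferAsync("User: Hello\nBot:", inferenceParams))
{
    Console.Write(text);
}

The exception is thrown from the Sample call above as soon as inference starts.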
This could be fixed either by assigning a DefaultSamplingPipeline instance to the InferenceParams in the README example, or by falling back to a default pipeline inside the InferInternal method when SamplingPipeline is null, like this:
InferenceParams inferenceParams = new()
{
    MaxTokens = 256, // No more than 256 tokens should appear in the answer. Remove it if the antiprompt is enough for control.
    AntiPrompts = new List<string> { "User:" }, // Stop generation once an antiprompt appears.
+   SamplingPipeline = new DefaultSamplingPipeline(),
};

protected override async Task InferInternal(IInferenceParams inferenceParams, InferStateArgs args)
{
    // ...
    // Sample with the pipeline
+   inferenceParams.SamplingPipeline ??= new DefaultSamplingPipeline();
    var id = inferenceParams.SamplingPipeline.Sample(Context.NativeHandle, batch.TokenCount - 1);
    // ...
}

I can make a PR if this sounds good to you.