Closed
Labels: status: tracking (Tracking work in progress)
Description
Overview
The OpenAI chat completion API supports a number of parameters that are not currently supported by the mlc-llm REST API, such as temperature and stop sequences. This issue tracks adding support for these parameters so that they can be used by downstream dependents of the REST API, such as the LangChain integration.
Note that some parameters may require changes in the underlying chat module or `llm_chat`.
Action Items
- `temperature`
- `top_p`
- `n`
- `stop`
- `max_tokens`
- `presence_penalty`
- `frequency_penalty`
See the OpenAI API Reference for details on each of these parameters.
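As a hedged sketch of what supporting these parameters could look like (this is not the actual mlc-llm implementation; the function name and defaults are illustrative assumptions), the REST API handler might validate each OpenAI-style sampling parameter and assemble them into a chat-completion request body, omitting optional fields when unset:

```python
def build_chat_request(messages,
                       temperature=1.0,
                       top_p=1.0,
                       n=1,
                       stop=None,
                       max_tokens=None,
                       presence_penalty=0.0,
                       frequency_penalty=0.0):
    """Assemble a chat-completion request body with basic range checks.

    Ranges follow the OpenAI API reference: temperature in [0, 2],
    top_p in (0, 1], penalties in [-2, 2].
    """
    if not 0.0 <= temperature <= 2.0:
        raise ValueError("temperature must be in [0, 2]")
    if not 0.0 < top_p <= 1.0:
        raise ValueError("top_p must be in (0, 1]")
    if not -2.0 <= presence_penalty <= 2.0:
        raise ValueError("presence_penalty must be in [-2, 2]")
    if not -2.0 <= frequency_penalty <= 2.0:
        raise ValueError("frequency_penalty must be in [-2, 2]")

    body = {
        "messages": messages,
        "temperature": temperature,
        "top_p": top_p,
        "n": n,
        "presence_penalty": presence_penalty,
        "frequency_penalty": frequency_penalty,
    }
    # Optional fields are omitted when unset, matching OpenAI's behavior.
    # `stop` may be a single string or a list of up to four sequences.
    if stop is not None:
        body["stop"] = [stop] if isinstance(stop, str) else list(stop)
    if max_tokens is not None:
        body["max_tokens"] = max_tokens
    return body
```

A handler like this keeps validation at the API boundary, so the chat module only ever sees in-range values.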
Links to Related Issues and PRs