Commit c67e501
docs: fix the Palm2TextGenerator output token size (#649)
Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:
- [ ] Make sure to open an issue as a [bug/issue](https://togithub.com/googleapis/python-bigquery-dataframes/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)

Fixes internal #333480290 🦕
1 parent 9963f85 commit c67e501

1 file changed: +3 −3 lines changed

bigframes/ml/llm.py

Lines changed: 3 additions & 3 deletions
@@ -233,7 +233,7 @@ def predict(
             max_output_tokens (int, default 128):
                 Maximum number of tokens that can be generated in the response. Specify a lower value for shorter responses and a higher value for longer responses.
                 A token may be smaller than a word. A token is approximately four characters. 100 tokens correspond to roughly 60-80 words.
-                Default 128. For the 'text-bison' model, possible values are in the range [1, 1024]. For the 'text-bison-32k' model, possible values are in the range [1, 8196].
+                Default 128. For the 'text-bison' model, possible values are in the range [1, 1024]. For the 'text-bison-32k' model, possible values are in the range [1, 8192].
                 Please ensure that the specified value for max_output_tokens is within the appropriate range for the model being used.

             top_k (int, default 40):
@@ -269,10 +269,10 @@ def predict(

         if (
             self.model_name == _TEXT_GENERATOR_BISON_32K_ENDPOINT
-            and max_output_tokens not in range(1, 8197)
+            and max_output_tokens not in range(1, 8193)
         ):
             raise ValueError(
-                f"max_output_token must be [1, 8196] for TextBison 32k model, but is {max_output_tokens}."
+                f"max_output_token must be [1, 8192] for TextBison 32k model, but is {max_output_tokens}."
             )

         if top_k not in range(1, 41):
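For context, here is a minimal usage sketch exercising the corrected bound. It assumes a configured BigQuery DataFrames session and an existing BigQuery connection; the project/region/connection names below are hypothetical placeholders, not part of this commit.

```python
# Minimal sketch (not from the commit): calls predict() on the
# 'text-bison-32k' model with max_output_tokens at the corrected
# upper bound of 8192. After this change, 8196 would raise ValueError.
import bigframes.pandas as bpd
from bigframes.ml.llm import PaLM2TextGenerator

df = bpd.DataFrame({"prompt": ["Summarize BigQuery DataFrames in one sentence."]})

model = PaLM2TextGenerator(
    model_name="text-bison-32k",
    connection_name="my-project.us.my-connection",  # hypothetical connection
)

result = model.predict(df, max_output_tokens=8192)  # within [1, 8192]
```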
