
Commit d045702

doc: use snowflake-arctic-embed2 instead of bge-m3 for ollama examples
Signed-off-by: thiswillbeyourgithub <[email protected]>
1 parent 60ef411 commit d045702

File tree

1 file changed: +3 −3 lines changed


wdoc/docs/examples.md

Lines changed: 3 additions & 3 deletions
@@ -50,7 +50,7 @@ wdoc --help
 
 8. If you want to only use local models, here's an example with [ollama](https://ollama.com/):
 ```bash
-wdoc --model="ollama/qwen3:8b" --query_eval_model="ollama/qwen3:8b" --embed_model="ollama/bge-m3" --task summarize --path https://situational-awareness.ai/
+wdoc --model="ollama/qwen3:8b" --query_eval_model="ollama/qwen3:8b" --embed_model="ollama/snowflake-arctic-embed2" --task summarize --path https://situational-awareness.ai/
 ```
 You can always add `--private` to add additional safety nets that no data will leave your local network. You can also override specific API endpoints using
 ```bash
@@ -162,7 +162,7 @@ wdoc --task=summary \
 ```zsh
 wdoc --model="ollama/qwen3:8b" \
 --query_eval_model="ollama/qwen3:8b" \
---embed_model="ollama/bge-m3" \
+--embed_model="ollama/snowflake-arctic-embed2" \
 --task summarize --path https://situational-awareness.ai/
 ```
 
@@ -173,7 +173,7 @@ wdoc --model="ollama/qwen3:8b" \
 --model_kwargs='{"max_tokens": 4096}' \
 --query_eval_model="ollama/qwen3:8b" \
 --query_eval_model_kwargs='{"max_tokens": 4096}' \
---embed_model="ollama/bge-m3" \
+--embed_model="ollama/snowflake-arctic-embed2" \
 --task summarize --path https://situational-awareness.ai/
 ```
 
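The updated examples assume both ollama models are already available locally. A minimal sketch of that prerequisite step (assumes a working ollama install; the model names are taken from the diff above, and the script skips gracefully when ollama is absent):

```shell
# Fetch the models referenced by the updated wdoc examples.
# Falls back to a hint message if ollama is not installed.
if command -v ollama >/dev/null 2>&1; then
  ollama pull qwen3:8b
  ollama pull snowflake-arctic-embed2
else
  echo "ollama not found; install it from https://ollama.com/ first"
fi
```

Once pulled, the `wdoc` invocations in the diff can run fully offline against the local ollama server.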