7 changes: 7 additions & 0 deletions docs/api/models/grok.md
@@ -0,0 +1,7 @@
# `pydantic_ai.models.grok`

## Setup

For details on how to set up authentication with this model, see [model configuration for Grok](../../models/grok.md).

::: pydantic_ai.models.grok
1 change: 1 addition & 0 deletions docs/models/overview.md
@@ -5,6 +5,7 @@ Pydantic AI is model-agnostic and has built-in support for multiple model provid
* [OpenAI](openai.md)
* [Anthropic](anthropic.md)
* [Gemini](google.md) (via two different APIs: Generative Language API and VertexAI API)
* [Grok](grok.md)
* [Groq](groq.md)
* [Mistral](mistral.md)
* [Cohere](cohere.md)
73 changes: 73 additions & 0 deletions docs/models/xai.md
@@ -0,0 +1,73 @@
# xAI

## Install

To use [`XaiModel`][pydantic_ai.models.xai.XaiModel], you need to either install `pydantic-ai`, or install `pydantic-ai-slim` with the `xai` optional group:

```bash
pip/uv-add "pydantic-ai-slim[xai]"
```

## Configuration

To use xAI models via the [xAI API](https://x.ai/api), go to [console.x.ai](https://console.x.ai/team/default/api-keys) to create an API key.

[`GrokModelName`][pydantic_ai.providers.grok.GrokModelName] contains a list of available xAI models.

## Environment variable

Once you have the API key, you can set it as an environment variable:

```bash
export XAI_API_KEY='your-api-key'
```

You can then use [`XaiModel`][pydantic_ai.models.xai.XaiModel] by name:

```python
from pydantic_ai import Agent

agent = Agent('xai:grok-4-1-fast-non-reasoning')
...
```

Or initialise the model directly:

```python
from pydantic_ai import Agent
from pydantic_ai.models.xai import XaiModel

# Uses XAI_API_KEY environment variable
model = XaiModel('grok-4-1-fast-non-reasoning')
agent = Agent(model)
...
```

You can also customize the [`XaiModel`][pydantic_ai.models.xai.XaiModel] with a custom provider:

```python
from pydantic_ai import Agent
from pydantic_ai.models.xai import XaiModel
from pydantic_ai.providers.xai import XaiProvider

# Custom API key
provider = XaiProvider(api_key='your-api-key')
model = XaiModel('grok-4-1-fast-non-reasoning', provider=provider)
agent = Agent(model)
...
```

Or with a custom `xai_sdk.AsyncClient`:

```python
from xai_sdk import AsyncClient
from pydantic_ai import Agent
from pydantic_ai.models.xai import XaiModel
from pydantic_ai.providers.xai import XaiProvider

xai_client = AsyncClient(api_key='your-api-key')
provider = XaiProvider(xai_client=xai_client)
model = XaiModel('grok-4-1-fast-non-reasoning', provider=provider)
agent = Agent(model)
...
```
88 changes: 88 additions & 0 deletions examples/pydantic_ai_examples/stock_analysis_agent.py
Collaborator comment: If we're going to include this example, we should have a doc under docs/examples as well. I'm also OK with not including this example.

@@ -0,0 +1,88 @@
"""Example of using Grok's server-side web_search tool.

This agent:
1. Uses web_search to find the hottest performing stock yesterday
2. Provides buy analysis for the user
"""

import logfire
from pydantic import BaseModel, Field

from pydantic_ai import (
Agent,
BuiltinToolCallPart,
WebSearchTool,
)
from pydantic_ai.models.xai import XaiModel

logfire.configure()
logfire.instrument_pydantic_ai()

# Configure for xAI API - XAI_API_KEY environment variable is required
# The model will automatically use XaiProvider with the API key from the environment

# Create the model using XaiModel with server-side tools
model = XaiModel('grok-4-fast')


class StockAnalysis(BaseModel):
"""Analysis of top performing stock."""

stock_symbol: str = Field(description='Stock ticker symbol')
current_price: float = Field(description='Current stock price')
buy_analysis: str = Field(description='Brief analysis for whether to buy the stock')


# This agent uses server-side web search to research stocks
stock_analysis_agent = Agent[None, StockAnalysis](
model=model,
output_type=StockAnalysis,
builtin_tools=[WebSearchTool()],
system_prompt=(
'You are a stock analysis assistant. '
'Use web_search to find the hottest performing stock from yesterday on NASDAQ. '
'Provide the current price and a brief buy analysis explaining whether this is a good buy.'
),
)


async def main():
"""Run the stock analysis agent."""
query = 'What was the hottest performing stock on NASDAQ yesterday?'

print('πŸ” Starting stock analysis...\n')
print(f'Query: {query}\n')

async with stock_analysis_agent.run_stream(query) as result:
# Stream responses as they happen
async for message, _is_last in result.stream_responses():
for part in message.parts:
if isinstance(part, BuiltinToolCallPart):
print(f'πŸ”§ Server-side tool: {part.tool_name}')

# Access output after streaming is complete
output = await result.get_output()

print('\nβœ… Analysis complete!\n')

print(f'πŸ“Š Top Stock: {output.stock_symbol}')
print(f'πŸ’° Current Price: ${output.current_price:.2f}')
print(f'\nπŸ“ˆ Buy Analysis:\n{output.buy_analysis}')

# Show usage statistics
usage = result.usage()
print('\nπŸ“Š Usage Statistics:')
print(f' Requests: {usage.requests}')
print(f' Input Tokens: {usage.input_tokens}')
print(f' Output Tokens: {usage.output_tokens}')
print(f' Total Tokens: {usage.total_tokens}')

# Show server-side tools usage if available
if usage.details and 'server_side_tools_used' in usage.details:
print(f' Server-Side Tools: {usage.details["server_side_tools_used"]}')


if __name__ == '__main__':
import asyncio

asyncio.run(main())
6 changes: 6 additions & 0 deletions pydantic_ai_slim/pydantic_ai/builtin_tools.py
@@ -75,6 +75,7 @@ class WebSearchTool(AbstractBuiltinTool):
* OpenAI Responses
* Groq
* Google
* Grok
"""

search_context_size: Literal['low', 'medium', 'high'] = 'medium'
@@ -159,6 +160,7 @@ class CodeExecutionTool(AbstractBuiltinTool):
* Anthropic
* OpenAI Responses
* Google
* Grok
"""

kind: str = 'code_execution'
@@ -280,6 +282,7 @@ class MCPServerTool(AbstractBuiltinTool):

* OpenAI Responses
* Anthropic
* Grok
"""

id: str
@@ -298,6 +301,7 @@ class MCPServerTool(AbstractBuiltinTool):

* OpenAI Responses
* Anthropic
* Grok
"""

description: str | None = None
@@ -315,6 +319,7 @@ class MCPServerTool(AbstractBuiltinTool):

* OpenAI Responses
* Anthropic
* Grok
"""

headers: dict[str, str] | None = None
@@ -325,6 +330,7 @@ class MCPServerTool(AbstractBuiltinTool):
Supported by:

* OpenAI Responses
* Grok
"""

kind: str = 'mcp_server'
19 changes: 19 additions & 0 deletions pydantic_ai_slim/pydantic_ai/models/__init__.py
@@ -175,6 +175,24 @@
'grok:grok-3-mini-fast',
'grok:grok-4',
'grok:grok-4-0709',
'grok:grok-4-1-fast-non-reasoning',
'grok:grok-4-1-fast-reasoning',
'grok:grok-4-fast-non-reasoning',
'grok:grok-4-fast-reasoning',
Collaborator comment: grok-4-fast as well?

Author reply: grok-4-fast is an alias for grok-4-fast-reasoning; how should we handle those? Do we need all aliases defined?
'grok:grok-code-fast-1',
'xai:grok-2-image-1212',
'xai:grok-2-vision-1212',
'xai:grok-3',
'xai:grok-3-fast',
'xai:grok-3-mini',
'xai:grok-3-mini-fast',
'xai:grok-4',
'xai:grok-4-0709',
'xai:grok-4-1-fast-non-reasoning',
'xai:grok-4-1-fast-reasoning',
'xai:grok-4-fast-non-reasoning',
'xai:grok-4-fast-reasoning',
'xai:grok-code-fast-1',
'groq:deepseek-r1-distill-llama-70b',
'groq:deepseek-r1-distill-qwen-32b',
'groq:distil-whisper-large-v3-en',
@@ -804,6 +822,7 @@ def infer_model( # noqa: C901
'fireworks',
'github',
'grok',
'xai',
'heroku',
'moonshotai',
'ollama',