We recommend you *avoid* using this module-level client in your application code because:
- It's harder to mock for testing purposes.
- It's impossible to control cleanup of network connections.
## Pagination

List methods in the OpenAI API are paginated, and the Python library provides auto-paginating iterators on list responses, so you don't need to manually request successive pages.

This example shows auto-pagination when [listing fine-tuning jobs][src.openai.resources.fine_tuning.Jobs.list]:

```python
from openai import OpenAI

client = OpenAI()

all_jobs = []
# Automatically fetches more pages as needed.
for job in client.fine_tuning.jobs.list(
    limit=20,
):
    # Do something with each job here.
    all_jobs.append(job)
print(all_jobs)
```

Auto-pagination is also supported when [listing asynchronous fine-tuning jobs][src.openai.resources.fine_tuning.AsyncJobs.list]:

```python
import asyncio

from openai import AsyncOpenAI

client = AsyncOpenAI()


async def main() -> None:
    all_jobs = []
    # Iterate through items across all pages, issuing requests as needed.
    async for job in client.fine_tuning.jobs.list(
        limit=20,
    ):
        all_jobs.append(job)
    print(all_jobs)


asyncio.run(main())
```
### Manual pagination

For more granular control of pagination, you can instead use the `.has_next_page()`, [`.next_page_info()`][src.openai.pagination.SyncCursorPage.next_page_info], or `.get_next_page()` methods:

```python
first_page = await client.fine_tuning.jobs.list(
    limit=20,
)
if first_page.has_next_page():
    print(f"will fetch next page using these details: {first_page.next_page_info()}")
    next_page = await first_page.get_next_page()
    print(f"number of items we just fetched: {len(next_page.data)}")
```
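Conceptually, each page carries a cursor that identifies the next page, and `has_next_page()` simply checks whether that cursor exists. A toy sketch of the pattern follows (a hypothetical illustration, not the library's implementation):

```python
from dataclasses import dataclass
from typing import Dict, List, Optional


@dataclass
class Page:
    # Toy stand-in for a cursor-paginated response page.
    data: List[int]
    next_cursor: Optional[str]

    def has_next_page(self) -> bool:
        return self.next_cursor is not None


# Fake "API": successive responses keyed by cursor; None means the first request.
responses: Dict[Optional[str], Page] = {
    None: Page(data=[1, 2], next_cursor="c2"),
    "c2": Page(data=[3, 4], next_cursor=None),
}

items: List[int] = []
page = responses[None]       # first request, no cursor yet
items.extend(page.data)
while page.has_next_page():  # keep fetching until the cursor runs out
    page = responses[page.next_cursor]
    items.extend(page.data)
print(items)  # [1, 2, 3, 4]
```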
## Request types

Nested **request** parameters are Python [TypedDicts][typing.TypedDict].

For example, the user message in the following [`chat.completions.create()`][src.openai.resources.chat.completions.Completions.create] request is a [`ChatCompletionUserMessageParam`][src.openai.types.chat.chat_completion_user_message_param.ChatCompletionUserMessageParam], which has a base type of [`TypedDict`][typing.TypedDict]:

```python
from openai import OpenAI

client = OpenAI()

completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "Can you generate an example JSON object describing a fruit?",
        }
    ],
    model="gpt-3.5-turbo-1106",
    response_format={"type": "json_object"},
)
```
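Because request parameters are plain dictionaries, a type checker can validate their keys and value types statically. A minimal sketch of the idea, using a hypothetical simplified `UserMessageParam` rather than the library's actual type:

```python
from typing import TypedDict


class UserMessageParam(TypedDict):
    # Hypothetical, simplified stand-in for ChatCompletionUserMessageParam.
    role: str
    content: str


def render(message: UserMessageParam) -> str:
    # At runtime the value is an ordinary dict; the typing is purely static.
    return f'{message["role"]}: {message["content"]}'


msg: UserMessageParam = {"role": "user", "content": "hi"}
print(render(msg))  # user: hi
```

A type checker such as Pylance or mypy would flag a misspelled key (e.g. `"roel"`) or a non-string value before the code ever runs.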

### File upload request types

Request parameters that correspond to file uploads can be passed as [`bytes`][bytes], a [`PathLike`][os.PathLike] instance, or a tuple of `(filename, contents, media type)`.
```python
from pathlib import Path

from openai import OpenAI

client = OpenAI()

client.files.create(
    file=Path("input.jsonl"),
    purpose="fine-tune",
)
```
The async client uses the same interface. If you pass a [`PathLike`][os.PathLike] instance, the file contents are automatically read asynchronously.
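The three accepted shapes can be told apart at runtime. The helper below is purely illustrative (`describe_file_input` is a hypothetical name, not part of the library's API):

```python
import os
from pathlib import Path
from typing import Tuple, Union

FileInput = Union[bytes, os.PathLike, Tuple[str, bytes, str]]


def describe_file_input(value: FileInput) -> str:
    # Illustrative only: report which of the three accepted shapes was passed.
    if isinstance(value, bytes):
        return "raw bytes"
    if isinstance(value, os.PathLike):
        return f"path to read: {os.fspath(value)}"
    filename, contents, media_type = value
    return f"{filename} ({media_type}, {len(contents)} bytes)"


print(describe_file_input(b"abc"))               # raw bytes
print(describe_file_input(Path("input.jsonl")))  # path to read: input.jsonl
print(describe_file_input(("input.jsonl", b"{}", "application/jsonl")))
```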

## Response types

**Responses** are [Pydantic](https://docs.pydantic.dev) models that include helper methods for things like:
- Serializing the object to JSON: [`example_response_object.model_dump_json`][src.openai.BaseModel.model_dump_json]`(indent=2, exclude_unset=True)`
- Converting the object to a dictionary: [`example_response_object.model_dump`][src.openai.BaseModel.model_dump]`(exclude_unset=True)`
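For instance, these helpers behave as follows on any Pydantic model; this sketch uses a stand-alone model named `Fruit` (a hypothetical example, not a type from the library):

```python
from pydantic import BaseModel


class Fruit(BaseModel):
    # Hypothetical stand-in; the library's response models derive from a Pydantic base.
    name: str
    color: str = "unknown"


fruit = Fruit(name="kiwi")

# exclude_unset=True omits fields that were never explicitly provided,
# so the defaulted `color` field is left out of both outputs.
as_dict = fruit.model_dump(exclude_unset=True)
as_json = fruit.model_dump_json(indent=2, exclude_unset=True)

print(as_dict)  # {'name': 'kiwi'}
print(as_json)
```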
!!! Tip

    Typed requests and responses enable type checking, autocompletion, and hover-help documentation in editors that support those features. In Visual Studio Code, for example, you can [enable type checking in Pylance](https://marketplace.visualstudio.com/items?itemName=ms-python.vscode-pylance) by setting `python.analysis.typeCheckingMode` to `basic` as described in that article's **Settings and Customization** section.
These docs are officially unofficial and unsupported, but you're welcome to use and improve them until OpenAI brings up their own (1) or they ask me to take them down.
{ .annotate }
1. I'll likely decommission this site when OpenAI [publishes their own Python API reference](https://community.openai.com/t/where-is-the-documentation-for-the-python-openai-sdk/583643).

You might instead prefer to use OpenAI's official documentation.

### Unsupported
**The documentation on this site is provided AS-IS with NO WARRANTY.** The API reference is indeed generated from the OpenAI Python library's code, but I make no promises nor guarantees that these docs reflect the current, past, or future state of the library.
That said, I use these docs myself and thus intend to keep them (mostly) current. However, there's no automation pulling content from their repo to this fork. (1)
{ .annotate }
1. That means you might encounter inaccuracies or you might not find what you think should be here. In either case, you should refer to [openai/openai-python](https://github.com/openai/openai-python) as the source of truth.
!!! quote

    If these docs help you, yay! If they don't, don't use 'em. Enjoy! *—[Marsh](https://github.com/mmacy)*