
Commit 8dbb9d3

Merge pull request #23 from guardrails-ai/0.3.10
fix python deps; update docs to include openai route
2 parents: 2cb3ab1 + 963cc2b

17 files changed: +1059 −9 lines

resources/py/README.md

Lines changed: 4 additions & 0 deletions
@@ -45,6 +45,10 @@ bash ./py-build.sh
 - [LLMResponse](https://github.com/guardrails-ai/guardrails-api-client/tree/main/resources/py/docs/LLMResponse.md);
 - [MetaData](https://github.com/guardrails-ai/guardrails-api-client/tree/main/resources/py/docs/MetaData.md);
 - [ModelSchema](https://github.com/guardrails-ai/guardrails-api-client/tree/main/resources/py/docs/ModelSchema.md);
+- [OpenAIChatCompletion](https://github.com/guardrails-ai/guardrails-api-client/tree/main/resources/py/docs/OpenAIChatCompletion.md);
+- [OpenAIChatCompletionPayload](https://github.com/guardrails-ai/guardrails-api-client/tree/main/resources/py/docs/OpenAIChatCompletionPayload.md);
+- [OpenAIChatCompletionPayloadMessagesInner](https://github.com/guardrails-ai/guardrails-api-client/tree/main/resources/py/docs/OpenAIChatCompletionPayloadMessagesInner.md);
+- [OpenaiApi](https://github.com/guardrails-ai/guardrails-api-client/tree/main/resources/py/docs/OpenaiApi.md);
 - [Outputs](https://github.com/guardrails-ai/guardrails-api-client/tree/main/resources/py/docs/Outputs.md);
 - [OutputsParsedOutput](https://github.com/guardrails-ai/guardrails-api-client/tree/main/resources/py/docs/OutputsParsedOutput.md);
 - [OutputsValidationResponse](https://github.com/guardrails-ai/guardrails-api-client/tree/main/resources/py/docs/OutputsValidationResponse.md);
resources/py/docs/OpenAIChatCompletion.md

Lines changed: 32 additions & 0 deletions
@@ -0,0 +1,32 @@
# OpenAIChatCompletion

## Properties

Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**id** | **str** | The id |
**created** | **str** | The created date |
**model_name** | **str** | The model name |
**choices** | [**List[OpenAIChatCompletionPayloadMessagesInner]**](OpenAIChatCompletionPayloadMessagesInner.md) | |

## Example

```python
from guardrails_api_client.models.open_ai_chat_completion import OpenAIChatCompletion

# TODO update the JSON string below
json = "{}"
# create an instance of OpenAIChatCompletion from a JSON string
open_ai_chat_completion_instance = OpenAIChatCompletion.from_json(json)
# print the JSON string representation of the object
print(open_ai_chat_completion_instance.to_json())

# convert the object into a dict
open_ai_chat_completion_dict = open_ai_chat_completion_instance.to_dict()
# create an instance of OpenAIChatCompletion from a dict
open_ai_chat_completion_from_dict = OpenAIChatCompletion.from_dict(open_ai_chat_completion_dict)
```

[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
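A filled-in counterpart to the generated example above; a sketch only, assuming the model accepts its documented properties as constructor keyword arguments and using made-up values to show how `choices` reuses the message shape:

```python
# Hypothetical, hand-built instance; field names follow the Properties table,
# every value is illustrative rather than taken from a real response.
from guardrails_api_client.models.open_ai_chat_completion import OpenAIChatCompletion
from guardrails_api_client.models.open_ai_chat_completion_payload_messages_inner import (
    OpenAIChatCompletionPayloadMessagesInner,
)

completion = OpenAIChatCompletion(
    id="chatcmpl-123",
    created="1715000000",
    model_name="gpt-4o-mini",
    choices=[OpenAIChatCompletionPayloadMessagesInner(role="assistant", content="Hello!")],
)
print(completion.to_json())
```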
resources/py/docs/OpenAIChatCompletionPayload.md

Lines changed: 32 additions & 0 deletions
@@ -0,0 +1,32 @@
# OpenAIChatCompletionPayload

## Properties

Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**model** | **str** | The model to use for the completion | [optional]
**messages** | [**List[OpenAIChatCompletionPayloadMessagesInner]**](OpenAIChatCompletionPayloadMessagesInner.md) | The messages to use for the completion | [optional]
**max_tokens** | **int** | The maximum number of tokens to generate | [optional]
**temperature** | **float** | The sampling temperature | [optional]

## Example

```python
from guardrails_api_client.models.open_ai_chat_completion_payload import OpenAIChatCompletionPayload

# TODO update the JSON string below
json = "{}"
# create an instance of OpenAIChatCompletionPayload from a JSON string
open_ai_chat_completion_payload_instance = OpenAIChatCompletionPayload.from_json(json)
# print the JSON string representation of the object
print(open_ai_chat_completion_payload_instance.to_json())

# convert the object into a dict
open_ai_chat_completion_payload_dict = open_ai_chat_completion_payload_instance.to_dict()
# create an instance of OpenAIChatCompletionPayload from a dict
open_ai_chat_completion_payload_from_dict = OpenAIChatCompletionPayload.from_dict(open_ai_chat_completion_payload_dict)
```

[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
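The payload can also be built from plain keyword arguments rather than a JSON string; a short sketch, assuming the generated pydantic constructors accept the fields from the tables above as keywords (all values are illustrative):

```python
# Build a request body for the guard-scoped chat completions route.
# Field names come from the Properties tables; values are made up.
from guardrails_api_client.models.open_ai_chat_completion_payload import OpenAIChatCompletionPayload
from guardrails_api_client.models.open_ai_chat_completion_payload_messages_inner import (
    OpenAIChatCompletionPayloadMessagesInner,
)

payload = OpenAIChatCompletionPayload(
    model="gpt-4o-mini",
    messages=[
        OpenAIChatCompletionPayloadMessagesInner(role="system", content="You are a helpful assistant."),
        OpenAIChatCompletionPayloadMessagesInner(role="user", content="Hello"),
    ],
    max_tokens=256,
    temperature=0.7,
)
print(payload.to_json())
```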
resources/py/docs/OpenAIChatCompletionPayloadMessagesInner.md

Lines changed: 30 additions & 0 deletions
@@ -0,0 +1,30 @@
# OpenAIChatCompletionPayloadMessagesInner

## Properties

Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**role** | **str** | The role of the message | [optional]
**content** | **str** | The content of the message | [optional]

## Example

```python
from guardrails_api_client.models.open_ai_chat_completion_payload_messages_inner import OpenAIChatCompletionPayloadMessagesInner

# TODO update the JSON string below
json = "{}"
# create an instance of OpenAIChatCompletionPayloadMessagesInner from a JSON string
open_ai_chat_completion_payload_messages_inner_instance = OpenAIChatCompletionPayloadMessagesInner.from_json(json)
# print the JSON string representation of the object
print(open_ai_chat_completion_payload_messages_inner_instance.to_json())

# convert the object into a dict
open_ai_chat_completion_payload_messages_inner_dict = open_ai_chat_completion_payload_messages_inner_instance.to_dict()
# create an instance of OpenAIChatCompletionPayloadMessagesInner from a dict
open_ai_chat_completion_payload_messages_inner_from_dict = OpenAIChatCompletionPayloadMessagesInner.from_dict(open_ai_chat_completion_payload_messages_inner_dict)
```

[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)

resources/py/docs/OpenaiApi.md

Lines changed: 96 additions & 0 deletions
@@ -0,0 +1,96 @@
# guardrails_api_client.OpenaiApi

All URIs are relative to *http://localhost*

Method | HTTP request | Description
------------- | ------------- | -------------
[**openai_chat_completion**](OpenaiApi.md#openai_chat_completion) | **POST** /guards/{guardName}/openai/v1/chat/completions | OpenAI SDK compatible endpoint for Chat Completions


# **openai_chat_completion**
> OpenAIChatCompletion openai_chat_completion(guard_name, open_ai_chat_completion_payload)

OpenAI SDK compatible endpoint for Chat Completions

### Example

* Api Key Authentication (ApiKeyAuth):
* Bearer (JWT) Authentication (BearerAuth):

```python
import os

import guardrails_api_client
from guardrails_api_client.models.open_ai_chat_completion import OpenAIChatCompletion
from guardrails_api_client.models.open_ai_chat_completion_payload import OpenAIChatCompletionPayload
from guardrails_api_client.rest import ApiException
from pprint import pprint

# Defining the host is optional and defaults to http://localhost
# See configuration.py for a list of all supported configuration parameters.
configuration = guardrails_api_client.Configuration(
    host = "http://localhost"
)

# The client must configure the authentication and authorization parameters
# in accordance with the API server security policy.
# Examples for each auth method are provided below; use the example that
# satisfies your auth use case.

# Configure API key authorization: ApiKeyAuth
configuration.api_key['ApiKeyAuth'] = os.environ["API_KEY"]

# Uncomment below to set up a prefix (e.g. Bearer) for the API key, if needed
# configuration.api_key_prefix['ApiKeyAuth'] = 'Bearer'

# Configure Bearer authorization (JWT): BearerAuth
configuration = guardrails_api_client.Configuration(
    access_token = os.environ["BEARER_TOKEN"]
)

# Enter a context with an instance of the API client
with guardrails_api_client.ApiClient(configuration) as api_client:
    # Create an instance of the API class
    api_instance = guardrails_api_client.OpenaiApi(api_client)
    guard_name = 'guard_name_example' # str | Guard name
    open_ai_chat_completion_payload = guardrails_api_client.OpenAIChatCompletionPayload() # OpenAIChatCompletionPayload |

    try:
        # OpenAI SDK compatible endpoint for Chat Completions
        api_response = api_instance.openai_chat_completion(guard_name, open_ai_chat_completion_payload)
        print("The response of OpenaiApi->openai_chat_completion:\n")
        pprint(api_response)
    except Exception as e:
        print("Exception when calling OpenaiApi->openai_chat_completion: %s\n" % e)
```

### Parameters

Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**guard_name** | **str** | Guard name |
**open_ai_chat_completion_payload** | [**OpenAIChatCompletionPayload**](OpenAIChatCompletionPayload.md) | |

### Return type

[**OpenAIChatCompletion**](OpenAIChatCompletion.md)

### Authorization

[ApiKeyAuth](../README.md#ApiKeyAuth), [BearerAuth](../README.md#BearerAuth)

### HTTP request headers

- **Content-Type**: application/json
- **Accept**: application/json

### HTTP response details

| Status code | Description | Response headers |
|-------------|-------------|------------------|
| **200** | The output of the completion | - |
| **0** | Unexpected error | - |

[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
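Because the route mirrors OpenAI's Chat Completions API, the official `openai` client can also be pointed at it directly; a minimal sketch, assuming a guard named `my-guard`, a server on `http://localhost`, and that your deployment handles the API key (none of these names come from this commit):

```python
# Reuse the official OpenAI SDK against the guard-scoped route.
# "my-guard", the host, and the key handling are assumptions, not part of this commit.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost/guards/my-guard/openai/v1",  # the SDK appends /chat/completions
    api_key="sk-...",  # whatever credential your Guardrails server expects or forwards
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```

The same request can of course be made with the generated `OpenaiApi` client shown above; the SDK route is simply convenient when existing OpenAI-based code should run through a guard unchanged.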

resources/py/pyproject.toml.template

Lines changed: 4 additions & 1 deletion
@@ -27,4 +27,7 @@ testpaths = [
 ]

 [tool.pyright]
-include = ["guardrails_api_client"]
+include = ["guardrails_api_client"]
+
+[tool.ruff.lint]
+ignore = ["E721"]

resources/py/scripts/prebuild.js

Lines changed: 5 additions & 5 deletions
@@ -41,7 +41,7 @@ ${
 }

 function updateDependencies () {
-  const pyProjectToml = fs.readFileSync(
+  let pyProjectToml = fs.readFileSync(
     path.resolve('./pyproject.toml.template')
   ).toString();
   const requirementsTxt = fs.readFileSync(

@@ -52,13 +52,13 @@ function updateDependencies () {
   const dependencies = `dependencies = [
 ${
   requirements
-    .map(r => `"${r.trim().split('\s').join('')}"`)
-    .join('\n')
+    .filter(r => r.length > 0)
+    .map(r => `\t"${r.trim().split('\s').join('')}"`)
+    .join(',\n')
 }
 ]`;

-  pyProjectToml.replace('dependencies = []', dependencies)
-
+  pyProjectToml = pyProjectToml.replace('dependencies = []', dependencies)


   fs.writeFileSync(

resources/ts/docs/classes/BaseAPI.md

Lines changed: 2 additions & 0 deletions
@@ -10,6 +10,8 @@ This is the base class for all generated API classes.

 [`GuardApi`](GuardApi.md)

+[`OpenaiApi`](OpenaiApi.md)
+
 [`ServiceHealthApi`](ServiceHealthApi.md)

 [`ValidateApi`](ValidateApi.md)
