
Conversation

@jsonbailey jsonbailey commented Oct 13, 2025

Note

Adds multi-provider (OpenAI, LangChain, Vercel) support with optional default provider for chat initialization and dynamic provider loading.

  • API:
    • LDAIClient.initChat accepts optional defaultAiProvider?: SupportedAIProvider and forwards it to provider creation.
    • Export SupportedAIProvider and SUPPORTED_AI_PROVIDERS via api/chat.
  • Chat Factory:
    • Implement multi-provider resolution with ordered fallback and optional default override.
    • Dynamically load providers: @launchdarkly/server-sdk-ai-openai (OpenAIProvider), @launchdarkly/server-sdk-ai-langchain (LangChainProvider), @launchdarkly/server-sdk-ai-vercel (VercelProvider).
    • Replace single LangChain-only path with generic _tryCreateProvider and _createProvider helpers; adjust logging and warnings.
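The ordered fallback with an optional default override described above can be sketched as follows. This is an illustrative sketch, not the SDK's actual implementation: the helper names (`resolveProviderOrder`, `tryCreateProvider`) and the `Record` of package names are assumptions; the PR itself names the internal helpers `_tryCreateProvider` and `_createProvider`.

```typescript
// Hypothetical sketch of ordered provider fallback with a default override.
// SUPPORTED_AI_PROVIDERS and SupportedAIProvider mirror the exports added
// via api/chat; the helper names here are illustrative only.
const SUPPORTED_AI_PROVIDERS = ['openai', 'langchain', 'vercel'] as const;
type SupportedAIProvider = (typeof SUPPORTED_AI_PROVIDERS)[number];

// If a default provider is requested, try it first, then fall back to the
// remaining providers in their standard order.
function resolveProviderOrder(defaultAiProvider?: SupportedAIProvider): SupportedAIProvider[] {
  if (!defaultAiProvider) return [...SUPPORTED_AI_PROVIDERS];
  return [defaultAiProvider, ...SUPPORTED_AI_PROVIDERS.filter((p) => p !== defaultAiProvider)];
}

// Dynamically attempt to load one provider package; a missing optional
// dependency resolves to undefined so the caller can try the next provider.
async function tryCreateProvider(name: SupportedAIProvider): Promise<unknown | undefined> {
  const packages: Record<SupportedAIProvider, string> = {
    openai: '@launchdarkly/server-sdk-ai-openai',
    langchain: '@launchdarkly/server-sdk-ai-langchain',
    vercel: '@launchdarkly/server-sdk-ai-vercel',
  };
  try {
    return await import(packages[name]);
  } catch {
    return undefined; // package not installed; caller moves on
  }
}
```

With this shape, `initChat(..., 'vercel')` would attempt the Vercel provider first and only fall back to OpenAI and LangChain if that package is not installed.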

Written by Cursor Bugbot for commit d8c8184. This will update automatically on new commits.

@jsonbailey jsonbailey requested a review from a team as a code owner October 13, 2025 15:33
@launchdarkly/browser size report
This is the brotli compressed size of the ESM build.
Compressed size: 169118 bytes
Compressed size limit: 200000 bytes
Uncompressed size: 789399 bytes
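A size report like the above can be produced by brotli-compressing the built ESM bundle and comparing the result to the configured limit. A minimal sketch, assuming Node's built-in `zlib` (the function name and paths are illustrative, not the repo's actual tooling):

```typescript
// Illustrative size check: measure a bundle's raw and brotli-compressed
// sizes and report whether the compressed size fits the configured limit.
import { brotliCompressSync } from 'node:zlib';
import { readFileSync } from 'node:fs';

function checkSize(bundlePath: string, compressedLimit: number) {
  const raw = readFileSync(bundlePath);
  const compressed = brotliCompressSync(raw);
  return {
    uncompressed: raw.length,
    compressed: compressed.length,
    withinLimit: compressed.length <= compressedLimit,
  };
}
```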

@launchdarkly/js-sdk-common size report
This is the brotli compressed size of the ESM build.
Compressed size: 24988 bytes
Compressed size limit: 26000 bytes
Uncompressed size: 122411 bytes

@launchdarkly/js-client-sdk size report
This is the brotli compressed size of the ESM build.
Compressed size: 21721 bytes
Compressed size limit: 25000 bytes
Uncompressed size: 74698 bytes

@launchdarkly/js-client-sdk-common size report
This is the brotli compressed size of the ESM build.
Compressed size: 17636 bytes
Compressed size limit: 20000 bytes
Uncompressed size: 90259 bytes


@jsonbailey jsonbailey merged commit 8553f24 into main Oct 13, 2025
29 checks passed
@jsonbailey jsonbailey deleted the jb/sdk-1456/add-vercel-aiprovider-support branch October 13, 2025 16:17
@github-actions github-actions bot mentioned this pull request Oct 13, 2025
jsonbailey added a commit that referenced this pull request Oct 13, 2025
🤖 I have created a release *beep* *boop*
---


<details><summary>server-sdk-ai: 0.12.0</summary>

## [0.12.0](server-sdk-ai-v0.11.4...server-sdk-ai-v0.12.0) (2025-10-13)


### Features

* Add support for TrackedChats in the AI SDK ([#939](#939)) ([a7ad0ea](a7ad0ea))
* Add support for Vercel AIProvider to the AI SDK ([#946](#946)) ([8553f24](8553f24))


### Bug Fixes

* Rename to AIProviderFactory for more accurate naming ([#949](#949)) ([92323ec](92323ec))
</details>

---
This PR was generated with [Release Please](https://github.com/googleapis/release-please). See [documentation](https://github.com/googleapis/release-please#release-please).

<!-- CURSOR_SUMMARY -->
---

> [!NOTE]
> Release server-sdk-ai 0.12.0 adding TrackedChats and OpenAI/LangChain/Vercel AIProviders; update manifest and example apps to the new version.
>
> - **server-sdk-ai 0.12.0**:
>   - **Features**: Support `TrackedChats`; add `OpenAI`, `LangChain`, and `Vercel` AIProviders.
>   - **Version bumps**:
>     - Update `packages/sdk/server-ai/package.json` and examples (`examples/openai`, `examples/bedrock`) to `0.12.0`.
>     - Update `.release-please-manifest.json` entry for `packages/sdk/server-ai` to `0.12.0`.
>
> <sup>Written by [Cursor Bugbot](https://cursor.com/dashboard?tab=bugbot) for commit ce739d8. This will update automatically on new commits.</sup>
<!-- /CURSOR_SUMMARY -->

---------

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: Jason Bailey <[email protected]>