
Conversation

@sameelarif (Member) commented Apr 23, 2025

why

  • users shouldn't have to copy/paste our aisdk examples

what changed

  • made aisdk a stagehand native client (usage sketch below)

test plan

  • evals
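
As a minimal usage sketch of what that means for users, assuming the aisdk/provider/model naming described further down in the review summary (the constructor options and page calls below are illustrative, not a verbatim API reference):

```ts
import { Stagehand } from "@browserbasehq/stagehand";

// Assumption: the native AI SDK client is selected purely via the model
// string (e.g. "aisdk/openai/gpt-4o"), with no hand-rolled AISdkClient wiring.
const stagehand = new Stagehand({
  env: "LOCAL",
  modelName: "aisdk/openai/gpt-4o",
  modelClientOptions: { apiKey: process.env.OPENAI_API_KEY },
});

await stagehand.init();
await stagehand.page.goto("https://docs.stagehand.dev");
await stagehand.page.act("click the quickstart link");
await stagehand.close();
```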

@changeset-bot (bot) commented Apr 23, 2025

🦋 Changeset detected

Latest commit: 35dd6c7

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 1 package: @browserbasehq/stagehand (Patch)


@sameelarif requested a review from @kamath on April 23, 2025 at 18:33
@miguelg719 (Collaborator) commented Apr 23, 2025

I started looking into this yesterday; it seems like if we're going with aisdk we should probably change our internal inference logic to adapt to generateText, generateObject, and maybe streamText (useful for operator), meaning we'd get rid of createChatCompletion and upgrade.
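
For illustration, a rough sketch of the adaptation being suggested here, built on the AI SDK's generateText, generateObject, and streamText; the helper names and message shapes are hypothetical, not Stagehand's actual inference interface:

```ts
// Hypothetical helpers showing how chat-completion-style calls could map onto
// the AI SDK primitives mentioned above. Not the real Stagehand code.
import { generateText, generateObject, streamText } from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

const model = openai("gpt-4o");

// Plain text generation (roughly what a schemaless createChatCompletion does).
async function callText(prompt: string): Promise<string> {
  const { text } = await generateText({
    model,
    messages: [{ role: "user", content: prompt }],
  });
  return text;
}

// Structured extraction (roughly createChatCompletion with a response schema).
async function callObject<T>(prompt: string, schema: z.Schema<T>): Promise<T> {
  const { object } = await generateObject({
    model,
    schema,
    messages: [{ role: "user", content: prompt }],
  });
  return object;
}

// Streaming output, e.g. for operator-style incremental responses
// (AI SDK v4 returns the stream result synchronously).
async function callStream(prompt: string, onChunk: (chunk: string) => void) {
  const result = streamText({
    model,
    messages: [{ role: "user", content: prompt }],
  });
  for await (const delta of result.textStream) {
    onChunk(delta);
  }
}
```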

@miguelg719 marked this pull request as ready for review on April 24, 2025 at 21:26
@greptile-apps bot (Contributor) left a comment


PR Summary

This PR integrates the AI SDK as a first-class LLM client in Stagehand, moving from an external implementation to a native one with enhanced logging and caching capabilities.

  • Added lib/llm/aisdk.ts implementing AISdkClient with support for multiple AI providers (OpenAI, Google, Anthropic, Groq, Cerebras)
  • Modified model naming convention to use format aisdk/provider/model (e.g. aisdk/openai/gpt-4o) for simplified integration (a resolution sketch follows this summary)
  • Moved AI SDK dependencies from devDependencies to optionalDependencies in package.json, allowing selective provider installation
  • Added support for provider-specific API keys in environment variables through lib/index.ts
  • Enhanced types/model.ts to support flexible model strings while maintaining type safety

10 file(s) reviewed, 8 comment(s)
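
As a hedged sketch of how an aisdk/provider/model string could be resolved to an AI SDK model handle (the provider map, parsing, and error handling here are illustrative, not the actual contents of lib/llm/aisdk.ts):

```ts
import type { LanguageModel } from "ai";
import { openai } from "@ai-sdk/openai";
import { anthropic } from "@ai-sdk/anthropic";
import { google } from "@ai-sdk/google";
import { groq } from "@ai-sdk/groq";
import { cerebras } from "@ai-sdk/cerebras";

// Hypothetical resolver: "aisdk/openai/gpt-4o" -> openai("gpt-4o").
const providers: Record<string, (modelId: string) => LanguageModel> = {
  openai,
  anthropic,
  google,
  groq,
  cerebras,
};

function resolveAiSdkModel(modelName: string): LanguageModel {
  const [prefix, providerName, ...modelId] = modelName.split("/");
  const provider = providers[providerName];
  if (prefix !== "aisdk" || !provider || modelId.length === 0) {
    throw new Error(`Unrecognized aisdk model string: ${modelName}`);
  }
  return provider(modelId.join("/"));
}
```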

]);

-export type AvailableModel = z.infer<typeof AvailableModelSchema>;
+export type AvailableModel = z.infer<typeof AvailableModelSchema> | string;
Contributor

style: Adding string to AvailableModel type weakens type safety. Consider using a branded type or maintaining an explicit list of supported models to prevent runtime errors from invalid model names.
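
One option along those lines, as a sketch (the known-model enum below is a placeholder, not the real contents of types/model.ts): the `(string & {})` trick keeps editor autocomplete for the known literals while still accepting arbitrary aisdk model strings.

```ts
import { z } from "zod";

// Placeholder enum; the real AvailableModelSchema enumerates the models
// Stagehand actually supports.
const AvailableModelSchema = z.enum(["gpt-4o", "claude-3-5-sonnet-latest"]);

// `string & {}` is not eagerly collapsed to `string` by the compiler, so IDEs
// keep suggesting the known literals, while any other model string is accepted.
export type AvailableModel =
  | z.infer<typeof AvailableModelSchema>
  | (string & {});
```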

Contributor

why are we deleting this file? we still need it for when we wrap the AI SDK for evals to get telemetry

import { groq } from "@ai-sdk/groq";
import { cerebras } from "@ai-sdk/cerebras";
import { openai } from "@ai-sdk/openai";
import { AISdkClient } from "@/lib/llm/aisdk";
Contributor

evals should only reference dist, not lib

@kamath merged commit c145bc1 into main on Apr 29, 2025; 13 checks passed.
@github-actions bot mentioned this pull request on Apr 29, 2025.