Conversation

@ryoppippi (Member) commented Dec 12, 2025

Summary

Reorganise README for improved readability and add missing integration documentation.

What Changed

  • Structure: Moved Development Environment section to the bottom
  • Integrations: Converted to collapsible <details> sections to reduce visual clutter
  • New integrations: Added Anthropic Claude (toAnthropic()) and OpenAI Responses API (toOpenAIResponses()) examples
  • Installation: Included dependency installation commands within each integration section
  • Package managers: Added bun as an option throughout

Why

Recent commits added new integrations (Anthropic Claude #208, OpenAI Responses API #206) but the README wasn't updated to document these. The collapsible sections improve discoverability when there are many integration options.
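For context, the new README sections are shaped roughly like the sketch below. This is not the exact README content: the StackOneToolSet constructor options, the model names, and the example prompts are assumptions, while fetchTools(), toAnthropic(), and toOpenAIResponses() are the calls this PR documents.

import Anthropic from "@anthropic-ai/sdk";
import OpenAI from "openai";
import { StackOneToolSet } from "@stackone/ai";

// Assumes STACKONE_API_KEY (and optionally an account ID) are picked up from the environment.
const toolset = new StackOneToolSet();
const tools = await toolset.fetchTools();

// Anthropic Claude: pass the converted tools to messages.create()
const anthropic = new Anthropic();
await anthropic.messages.create({
  model: "claude-sonnet-4-5", // placeholder model name
  max_tokens: 1024,
  tools: tools.toAnthropic(),
  messages: [{ role: "user", content: "Get the employee with id 123" }], // example prompt
});

// OpenAI Responses API: pass the converted tools to responses.create()
const openai = new OpenAI();
await openai.responses.create({
  model: "gpt-5.1",
  tools: tools.toOpenAIResponses(),
  input: "Get the employee with id 123", // example prompt
});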


Summary by cubic

Reorganizes the README for easier navigation and adds missing integration examples. Adds a Usage section and moves Integrations after it, documents Anthropic Claude and OpenAI Responses API examples, converts integrations to collapsible sections, inlines install commands, and adds bun as a package manager option.

  • New Features

    • Added Anthropic Claude example using toAnthropic().
    • Added OpenAI Responses API example using toOpenAIResponses().
    • Included per-integration install commands and bun support.
  • Refactors

    • Converted integration docs into collapsible details sections.
    • Moved Integrations section after Usage.
    • Moved the Development Environment section to the bottom.
    • Removed the separate “Optional: AI SDK Integration” section.
    • Removed a dead link to the non-existent custom-base-url.ts example.
    • Updated TanStack AI example to use gpt-5.1.

Written for commit 8b1484f. Summary will update automatically on new commits.

Reorganise README for improved readability and discoverability:

- Move Development Environment section to the bottom
- Convert integrations to collapsible details sections
- Add Anthropic Claude integration with toAnthropic() usage
- Add OpenAI Responses API integration with toOpenAIResponses()
- Include installation commands within each integration section
- Add bun as a package manager option throughout
- Remove separate "Optional: AI SDK Integration" section

The collapsible sections reduce visual clutter while keeping all
integration documentation easily accessible. Each integration now
includes its required dependencies inline.
@ryoppippi requested a review from a team as a code owner December 12, 2025 14:32
Copilot AI review requested due to automatic review settings December 12, 2025 14:32
pkg-pr-new bot commented Dec 12, 2025

Open in StackBlitz

npm i https://pkg.pr.new/StackOneHQ/stackone-ai-node/@stackone/ai@223

commit: 8b1484f

Copilot AI left a comment

Pull request overview

This PR reorganizes the README structure to improve readability and adds documentation for newly integrated AI platforms (Anthropic Claude and OpenAI Responses API). The changes focus on better organization and discoverability of integration examples.

  • Moved Development Environment section to the end for better flow
  • Added collapsible <details> sections for each integration to reduce visual clutter
  • Documented two new integrations: Anthropic Claude (toAnthropic()) and OpenAI Responses API (toOpenAIResponses())
  • Added bun as a package manager option throughout



const tools = await toolset.fetchTools();

await openai.chat.completions.create({
Copilot AI commented Dec 12, 2025

The variable "openai" is used here but is never defined or instantiated in this code example. An instance of OpenAI should be created before calling openai.chat.completions.create(), similar to how it's done in the OpenAI Responses API example on line 89.
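For reference, a corrected snippet might look like the sketch below. The imports, the toOpenAI() conversion, and the example prompt are assumptions based on the surrounding README; the only change actually being requested is the client instantiation line.

import OpenAI from "openai";
import { StackOneToolSet } from "@stackone/ai";

const toolset = new StackOneToolSet();
const tools = await toolset.fetchTools();

const openai = new OpenAI(); // the missing instantiation flagged in this comment
await openai.chat.completions.create({
  model: "gpt-5.1",
  messages: [{ role: "user", content: "Get the employee with id 123" }], // example prompt
  tools: tools.toOpenAI(),
});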

A contributor commented:

@ryoppippi that comment seems correct.

@cubic-dev-ai bot left a comment

1 issue found across 1 file

Prompt for AI agents (1 issue)

Check if these issues are valid — if so, understand the root cause of each and fix them.


<file name="README.md">

<violation number="1" location="README.md:51">
P1: Missing OpenAI client instantiation. The `openai` variable is used but never declared. Add `const openai = new OpenAI();` before calling `openai.chat.completions.create()`.</violation>
</file>



const tools = await toolset.fetchTools();

await openai.chat.completions.create({
@cubic-dev-ai bot commented Dec 12, 2025

P1: Missing OpenAI client instantiation. The openai variable is used but never declared. Add const openai = new OpenAI(); before calling openai.chat.completions.create().

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At README.md, line 51:

<comment>Missing OpenAI client instantiation. The `openai` variable is used but never declared. Add `const openai = new OpenAI();` before calling `openai.chat.completions.create()`.</comment>

<file context>
@@ -21,48 +21,94 @@ yarn add @stackone/ai
+
+const tools = await toolset.fetchTools();
+
+await openai.chat.completions.create({
+  model: "gpt-5.1",
+  messages: [
</file context>
Suggested change:

- await openai.chat.completions.create({
+ const openai = new OpenAI();
+ await openai.chat.completions.create({

A contributor commented:

@ryoppippi this suggestion from Cubic is probably good to apply as is, but I'll let you do so.

Reorder sections for better logical flow:
1. Installation - how to install the package
2. Usage - basic usage with authentication and account IDs
3. Integrations - framework-specific examples (OpenAI, Anthropic, etc.)
4. Features - advanced functionality
5. Development Environment - contributor setup

Users should understand basic usage patterns before seeing
framework-specific integration examples.
@glebedel (Contributor) left a comment

couple of comments from copilot worth reviewing - LGTM

Resolve README.md conflict by:
- Keeping all integration examples (OpenAI, Anthropic, AI SDK, TanStack AI, Claude Agent SDK)
- Reorganising structure: Installation → Usage → Integrations → Features
- Using collapsible sections for integration examples
- Removing dead link to non-existent custom-base-url.ts example

Each integration example already includes its own installation
instructions within the collapsible details section.

Each integration example now includes @stackone/ai in the npm install
command to ensure users install all required dependencies.
@ryoppippi merged commit 6373cfe into main Dec 12, 2025
10 checks passed
@ryoppippi deleted the docs/add-anthropic-openai-responses-integration branch December 12, 2025 16:37