@ashwinb ashwinb commented Nov 14, 2025

Mirrors the Python SDK. Used by the integration tests.

@meta-cla meta-cla bot added the "CLA Signed" label Nov 14, 2025
@ashwinb ashwinb changed the title from "feat(tests): add integration tests" to "feat: add output_text property on Response object" Nov 19, 2025
@ashwinb ashwinb merged commit bced728 into main Nov 19, 2025
8 checks passed
@ashwinb ashwinb deleted the add_ci branch November 19, 2025 00:38
@stainless-app stainless-app bot mentioned this pull request Nov 19, 2025
ashwinb added a commit that referenced this pull request Nov 19, 2025
Adds a simple helper function to extract aggregated text output from
ResponseObject instances. This is particularly useful for streaming
responses where chunk.response objects don't automatically have the
output_text property that non-streaming responses get from PR #42.

Usage:
  const stream = await client.responses.create({ stream: true, ... });
  for await (const chunk of stream) {
    if (chunk.type === 'response.completed') {
      const text = getOutputText(chunk.response);
    }
  }
ashwinb added a commit that referenced this pull request Nov 19, 2025
Adds a simple helper function to extract aggregated text output from
ResponseObject instances. This is particularly useful for streaming
responses where chunk.response objects don't automatically have the
output_text property that non-streaming responses get from PR #42.

Usage:
  import { getResponseOutputText } from 'llama-stack-client';

  const stream = await client.responses.create({ stream: true, ... });
  for await (const chunk of stream) {
    if (chunk.type === 'response.completed') {
      const text = getResponseOutputText(chunk.response);
    }
  }
ashwinb added a commit that referenced this pull request Nov 19, 2025
Adds a simple helper function to extract aggregated text output from
ResponseObject instances.

This replaces the automatic output_text property approach from PR #42
with an explicit helper function, providing a cleaner API without
prototype patching.

Usage:
  import { getResponseOutputText } from 'llama-stack-client';

  const stream = await client.responses.create({ stream: true, ... });
  for await (const chunk of stream) {
    if (chunk.type === 'response.completed') {
      const text = getResponseOutputText(chunk.response);
    }
  }

  // Also works for non-streaming responses
  const response = await client.responses.create({ stream: false, ... });
  const text = getResponseOutputText(response);
ashwinb added a commit that referenced this pull request Nov 19, 2025
…#48)

Adds a simple helper function to extract aggregated text output from
ResponseObject instances.

This **replaces** the automatic `output_text` property approach from PR
#42 with an explicit helper function, providing a cleaner API without
prototype patching or side-effectful imports.

## Changes

- **Removes** `src/lib/init.ts` (automatic property patching)
- **Adds** `getResponseOutputText()` helper function
- Works for both streaming and non-streaming responses

## Usage

### Streaming responses
```typescript
import { getResponseOutputText } from 'llama-stack-client';

const stream = await client.responses.create({ stream: true, ... });
for await (const chunk of stream) {
  if (chunk.type === 'response.completed') {
    const text = getResponseOutputText(chunk.response);
    console.log(text);
  }
}
```

### Non-streaming responses
```typescript
import { getResponseOutputText } from 'llama-stack-client';

const response = await client.responses.create({ stream: false, ... });
const text = getResponseOutputText(response);
```

This provides a clean, explicit API for accessing aggregated text from
responses without relying on magic property injection.
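For reference, the aggregation such a helper performs can be sketched as follows. This is a hypothetical sketch, not the library's actual implementation: the `ResponseObject` field names assumed here (`output`, `content`, `type: 'output_text'`, `text`) follow the OpenAI-style Responses shape this client mirrors, and the interfaces are minimal stand-ins rather than the SDK's real types.

```typescript
// Minimal stand-in types; the real SDK types are richer.
interface ContentPart {
  type: string;
  text?: string;
}

interface OutputItem {
  type: string;
  content?: ContentPart[];
}

interface ResponseObjectLike {
  output: OutputItem[];
}

// Sketch: walk the response's output items, collect the text of every
// `output_text` content part inside `message` items, and concatenate.
function getResponseOutputTextSketch(response: ResponseObjectLike): string {
  const parts: string[] = [];
  for (const item of response.output) {
    if (item.type === 'message' && item.content) {
      for (const part of item.content) {
        if (part.type === 'output_text' && typeof part.text === 'string') {
          parts.push(part.text);
        }
      }
    }
  }
  return parts.join('');
}
```

Because the function only reads plain data, it works identically on a `chunk.response` from a completed streaming event and on a non-streaming response object, which is exactly why an explicit helper needs no prototype patching.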
