update #1788

Changes from 10 commits
2 changes: 1 addition & 1 deletion .release-please-manifest.json
@@ -1,3 +1,3 @@
{
-  ".": "6.32.0"
+  ".": "6.33.0"
}
20 changes: 20 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,25 @@
# Changelog

## 6.33.0 (2026-03-22)

Full Changelog: [v6.32.0...v6.33.0](https://github.com/openai/openai-node/compare/v6.32.0...v6.33.0)

### Features

* **client:** add async iterator and stream() to WebSocket classes ([e1c16ee](https://github.com/openai/openai-node/commit/e1c16ee35b8ef9db30e9a99a2b3460368f3044d0))


### Chores

* **internal:** refactor imports ([cfe9c60](https://github.com/openai/openai-node/commit/cfe9c60aa41e9ed53e7d5f9187d31baf4364f8bd))
* **tests:** bump steady to v0.19.4 ([f2e9dea](https://github.com/openai/openai-node/commit/f2e9dea844405f189cc63a1d1493de3eabfcb7e7))
* **tests:** bump steady to v0.19.5 ([37c6cf4](https://github.com/openai/openai-node/commit/37c6cf495b9a05128572f9e955211b67d01410f3))


### Refactors

* **tests:** switch from prism to steady ([47c0581](https://github.com/openai/openai-node/commit/47c0581a1923c9e700a619dd6bfa3fb93a188899))

## 6.32.0 (2026-03-17)

Full Changelog: [v6.31.0...v6.32.0](https://github.com/openai/openai-node/compare/v6.31.0...v6.32.0)
2 changes: 1 addition & 1 deletion CONTRIBUTING.md
@@ -65,7 +65,7 @@ $ pnpm link --global openai

## Running tests

-Most tests require you to [set up a mock server](https://github.com/stoplightio/prism) against the OpenAPI spec to run the tests.
+Most tests require you to [set up a mock server](https://github.com/dgellow/steady) against the OpenAPI spec to run the tests.

```sh
$ ./scripts/mock
```
22 changes: 4 additions & 18 deletions ecosystem-tests/browser-direct-import/package-lock.json

Some generated files are not rendered by default.

30 changes: 23 additions & 7 deletions ecosystem-tests/vercel-edge/package-lock.json


2 changes: 1 addition & 1 deletion jsr.json
@@ -1,6 +1,6 @@
{
  "name": "@openai/openai",
-  "version": "6.32.0",
+  "version": "6.33.0",
  "exports": {
    ".": "./index.ts",
    "./helpers/zod": "./helpers/zod.ts",
Expand Down
2 changes: 1 addition & 1 deletion package.json
@@ -1,6 +1,6 @@
{
  "name": "openai",
-  "version": "6.32.0",
+  "version": "6.33.0",
  "description": "The official TypeScript library for the OpenAI API",
  "author": "OpenAI <support@openai.com>",
  "types": "dist/index.d.ts",
Expand Down
26 changes: 13 additions & 13 deletions scripts/mock
@@ -19,34 +19,34 @@ fi

echo "==> Starting mock server with URL ${URL}"

-# Run prism mock on the given spec
+# Run steady mock on the given spec
if [ "$1" == "--daemon" ]; then
  # Pre-install the package so the download doesn't eat into the startup timeout
-  npm exec --package=@stainless-api/prism-cli@5.15.0 -- prism --version
+  npm exec --package=@stdy/cli@0.19.5 -- steady --version

-  npm exec --package=@stainless-api/prism-cli@5.15.0 -- prism mock "$URL" &> .prism.log &
+  npm exec --package=@stdy/cli@0.19.5 -- steady --host 127.0.0.1 -p 4010 --validator-form-array-format=brackets --validator-query-array-format=brackets --validator-form-object-format=brackets --validator-query-object-format=brackets "$URL" &> .stdy.log &

-  # Wait for server to come online (max 30s)
+  # Wait for server to come online via health endpoint (max 30s)
  echo -n "Waiting for server"
  attempts=0
-  while ! grep -q "✖ fatal\|Prism is listening" ".prism.log" ; do
+  while ! curl --silent --fail "http://127.0.0.1:4010/_x-steady/health" >/dev/null 2>&1; do
+    if ! kill -0 $! 2>/dev/null; then
+      echo
+      cat .stdy.log
+      exit 1
+    fi
    attempts=$((attempts + 1))
    if [ "$attempts" -ge 300 ]; then
      echo
-      echo "Timed out waiting for Prism server to start"
-      cat .prism.log
+      echo "Timed out waiting for Steady server to start"
+      cat .stdy.log
      exit 1
    fi
    echo -n "."
    sleep 0.1
  done

-  if grep -q "✖ fatal" ".prism.log"; then
-    cat .prism.log
-    exit 1
-  fi
-
  echo
else
-  npm exec --package=@stainless-api/prism-cli@5.15.0 -- prism mock "$URL"
+  npm exec --package=@stdy/cli@0.19.5 -- steady --host 127.0.0.1 -p 4010 --validator-form-array-format=brackets --validator-query-array-format=brackets --validator-form-object-format=brackets --validator-query-object-format=brackets "$URL"
fi
16 changes: 8 additions & 8 deletions scripts/test
@@ -9,8 +9,8 @@ GREEN='\033[0;32m'
YELLOW='\033[0;33m'
NC='\033[0m' # No Color

-function prism_is_running() {
-  curl --silent "http://localhost:4010" >/dev/null 2>&1
+function steady_is_running() {
+  curl --silent "http://127.0.0.1:4010/_x-steady/health" >/dev/null 2>&1
}

kill_server_on_port() {
@@ -25,7 +25,7 @@ function is_overriding_api_base_url() {
  [ -n "$TEST_API_BASE_URL" ]
}

-if ! is_overriding_api_base_url && ! prism_is_running ; then
+if ! is_overriding_api_base_url && ! steady_is_running ; then
  # When we exit this script, make sure to kill the background mock server process
  trap 'kill_server_on_port 4010' EXIT

@@ -36,19 +36,19 @@ fi
if is_overriding_api_base_url ; then
  echo -e "${GREEN}✔ Running tests against ${TEST_API_BASE_URL}${NC}"
  echo
-elif ! prism_is_running ; then
-  echo -e "${RED}ERROR:${NC} The test suite will not run without a mock Prism server"
+elif ! steady_is_running ; then
+  echo -e "${RED}ERROR:${NC} The test suite will not run without a mock Steady server"
  echo -e "running against your OpenAPI spec."
  echo
  echo -e "To run the server, pass in the path or url of your OpenAPI"
-  echo -e "spec to the prism command:"
+  echo -e "spec to the steady command:"
  echo
-  echo -e " \$ ${YELLOW}npm exec --package=@stainless-api/prism-cli@5.15.0 -- prism mock path/to/your.openapi.yml${NC}"
+  echo -e " \$ ${YELLOW}npm exec --package=@stdy/cli@0.19.5 -- steady path/to/your.openapi.yml --host 127.0.0.1 -p 4010 --validator-form-array-format=brackets --validator-query-array-format=brackets --validator-form-object-format=brackets --validator-query-object-format=brackets${NC}"
  echo

  exit 1
else
-  echo -e "${GREEN}✔ Mock prism server is running with your OpenAPI spec${NC}"
+  echo -e "${GREEN}✔ Mock steady server is running with your OpenAPI spec${NC}"
  echo
fi

6 changes: 5 additions & 1 deletion src/resources/responses/internal-base.ts
@@ -2,11 +2,15 @@

import * as ResponsesAPI from './responses';
import { OpenAI } from '../../client';

import { EventEmitter } from '../../core/EventEmitter';
import { OpenAIError } from '../../core/error';
import { stringifyQuery } from '../../internal/utils';

+export type ResponsesStreamMessage =
+  | { type: 'connecting' | 'open' | 'closing' | 'close' }
+  | { type: 'message'; message: ResponsesAPI.ResponsesServerEvent }
+  | { type: 'error'; error: WebSocketError };

export class WebSocketError extends OpenAIError {
  /**
   * The error data that the API sent back in an error event.
Expand Down
131 changes: 130 additions & 1 deletion src/resources/responses/ws.ts
@@ -1,7 +1,7 @@
// File generated from our OpenAPI spec by Stainless. See CONTRIBUTING.md for details.

import * as WS from 'ws';
-import { ResponsesEmitter, buildURL } from './internal-base';
+import { ResponsesEmitter, ResponsesStreamMessage, WebSocketError, buildURL } from './internal-base';
import * as ResponsesAPI from './responses';
import { OpenAI } from '../../client';

@@ -65,6 +65,135 @@ export class ResponsesWS extends ResponsesEmitter
}
}

  /**
   * Returns an async iterator over WebSocket lifecycle and message events,
   * providing an alternative to the event-based `.on()` API.
   * The iterator will exit if the socket closes, but breaking out of the
   * iterator does not close the socket.
   *
   * @example
   * ```ts
   * for await (const event of connection.stream()) {
   *   switch (event.type) {
   *     case 'message':
   *       console.log('received:', event.message);
   *       break;
   *     case 'error':
   *       console.error(event.error);
   *       break;
   *     case 'close':
   *       console.log('connection closed');
   *       break;
   *   }
   * }
   * ```
   */
  stream(): AsyncIterableIterator<ResponsesStreamMessage> {
    return this[Symbol.asyncIterator]();
Comment on lines +91 to +92
P2: Add iterator support to the realtime WebSocket clients too

If a consumer applies the new for await/.stream() pattern to the other public websocket entry points (src/realtime/ws.ts, src/realtime/websocket.ts, and the src/beta/... variants), those classes still throw because they do not implement stream() or Symbol.asyncIterator. That leaves the SDK with two incompatible websocket APIs under one feature, so shared helpers over responses and realtime sockets will fail at runtime unless the sibling classes are updated as well.


  }

  [Symbol.asyncIterator](): AsyncIterableIterator<ResponsesStreamMessage> {
    // Two-queue async iterator: `queue` buffers incoming messages,
    // `resolvers` buffers waiting next() calls. A push wakes the
    // oldest next(); a next() drains the oldest message.
    const queue: ResponsesStreamMessage[] = [];
    const resolvers: (() => void)[] = [];
    let done = false;

    const push = (msg: ResponsesStreamMessage) => {
      queue.push(msg);
      resolvers.shift()?.();
    };
Comment on lines +103 to +106
Should we add a hard cap to the queue and define an overflow behavior? Right now this is an unbounded in-memory buffer; a slow or abandoned consumer could cause unbounded growth.
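As a rough illustration of this suggestion, a bounded drop-oldest variant of the iterator's internal queue could look like the sketch below. The cap value, the drop-oldest policy, and the `makeBoundedQueue` helper are all hypothetical, not part of this PR.

```typescript
// Sketch only: a capped two-queue buffer with a drop-oldest overflow policy.
function makeBoundedQueue<T>(max: number) {
  const queue: T[] = [];
  const resolvers: (() => void)[] = [];

  const push = (msg: T) => {
    if (queue.length >= max) {
      queue.shift(); // evict the oldest buffered message instead of growing
    }
    queue.push(msg);
    resolvers.shift()?.(); // wake the oldest waiting next(), if any
  };

  return { queue, resolvers, push };
}

const q = makeBoundedQueue<number>(2);
q.push(1);
q.push(2);
q.push(3); // overflows: 1 is dropped, queue holds [2, 3]
```

Dropping the newest message instead, or pushing an overflow `error` event, would be equally valid policies; the important part is that the choice is explicit.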


    const onEvent = (event: ResponsesAPI.ResponsesServerEvent) => {
      if (event.type === 'error') return; // handled by onEmitterError
      push({ type: 'message', message: event });
    };

    // Catches both API-level and socket-level errors via _onError → _emit('error')
    const onEmitterError = (err: WebSocketError) => {
      push({ type: 'error', error: err });
    };

    const onOpen = () => {
      push({ type: 'open' });
    };

    const flushResolvers = () => {
      for (let resolver = resolvers.shift(); resolver; resolver = resolvers.shift()) {
        resolver();
      }
    };

    const onClose = () => {
      push({ type: 'close' });
      done = true;
      flushResolvers();
      cleanup();
    };

    const cleanup = () => {
      this.off('event', onEvent);
      this.off('error', onEmitterError);
      this.socket.off('open', onOpen);
      this.socket.off('close', onClose);
    };

    this.on('event', onEvent);
    this.on('error', onEmitterError);
    this.socket.on('open', onOpen);
    this.socket.on('close', onClose);
Comment on lines +143 to +145
Curious: does each iterator attach its own listeners? If so, this might increase memory usage. Probably not a concern, but we should either enforce single-iterator ownership or document that multi-consumer duplication is intentional.
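For illustration of the multi-consumer behavior being asked about: each attach gets its own listener, so every concurrent consumer observes every event until it detaches. Node's `EventEmitter` stands in for the emitter/socket pair here as an assumption for brevity; `attachCounter` is a hypothetical stand-in for one iterator's listener set.

```typescript
import { EventEmitter } from "node:events";

// Each call attaches an independent listener, mirroring one iterator's
// `this.on('event', onEvent)` registration and its `cleanup()` teardown.
function attachCounter(source: EventEmitter) {
  let received = 0;
  const onEvent = () => {
    received += 1;
  };
  source.on("event", onEvent);
  return {
    count: () => received,
    detach: () => source.off("event", onEvent), // mirrors cleanup()
  };
}

const socket = new EventEmitter();
const a = attachCounter(socket);
const b = attachCounter(socket);
socket.emit("event"); // both consumers observe this one
a.detach();
socket.emit("event"); // only b observes this one
```

The memory cost is one small listener set plus one buffer per live iterator, which is why documenting the duplication (rather than forbidding it) may be enough.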


    switch (this.socket.readyState) {
      case WS.WebSocket.CONNECTING:
        push({ type: 'connecting' });
        break;
      case WS.WebSocket.OPEN:
        push({ type: 'open' });
        break;
      case WS.WebSocket.CLOSING:
        push({ type: 'closing' });
        break;
      case WS.WebSocket.CLOSED:
        push({ type: 'close' });
        done = true;
        cleanup();
        break;
    }
Comment on lines +147 to +162
This looks like it could be a race condition: the socket can transition after the listeners are attached but before the readyState switch runs, which could enqueue duplicate open or close events. I wonder if we should route these through a guarded call like emitOpen/emitClose.
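One way the guarded-call idea could be sketched: remember the last lifecycle state pushed and drop exact repeats, so a readyState check racing a listener cannot enqueue the same transition twice. `makeLifecycleGuard` and `emitState` are hypothetical names, not from this PR.

```typescript
type Lifecycle = "connecting" | "open" | "closing" | "close";

// Wraps a push function so consecutive duplicate lifecycle states are dropped.
function makeLifecycleGuard(push: (state: Lifecycle) => void) {
  let last: Lifecycle | undefined;
  return (state: Lifecycle) => {
    if (state === last) return; // duplicate transition: ignore
    last = state;
    push(state);
  };
}

const seen: Lifecycle[] = [];
const emitState = makeLifecycleGuard((s) => seen.push(s));
emitState("open");  // e.g. from the readyState switch
emitState("open");  // e.g. from the 'open' listener firing right after
emitState("close");
// seen now holds one "open" and one "close"
```

A guard like this would live per iterator, next to `done`, so it naturally resets with each new `[Symbol.asyncIterator]()` call.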


    const resolve = (res: (value: IteratorResult<ResponsesStreamMessage>) => void) => {
      if (queue.length > 0) {
        res({ value: queue.shift()!, done: false });
      } else if (done) {
        res({ value: undefined, done: true });
      } else {
        return false;
      }
      return true;
    };

    const next = (): Promise<IteratorResult<ResponsesStreamMessage>> =>
      new Promise((res) => {
        if (resolve(res)) return;
        resolvers.push(() => {
          resolve(res);
        });
      });

    return {
      next,
      return: (): Promise<IteratorReturnResult<undefined>> => {
        done = true;
        cleanup();
        flushResolvers();
        return Promise.resolve({ value: undefined, done: true });
      },
      [Symbol.asyncIterator]() {
        return this;
      },
    };
  }

  private authHeaders(): Record<string, string> {
-    return { Authorization: `Bearer ${this.client.apiKey}` };
+    return {};