Remove Responses API passthrough tests that need real /v1/responses

Responses API requests for OpenAI models pass through to /v1/responses on
the upstream, which doesn't work with mock servers. Remove those tests from
the mock suite; they're covered by the live e2e tests on main/nightly.
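
The routing split above is why only the translated path is mockable. A stdlib-only sketch of the failure mode (not the project's actual harness; `MockUpstream` and its routes are illustrative): the mock upstream serves /v1/chat/completions but has no /v1/responses handler, so a passthrough request 404s.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.error import HTTPError
from urllib.request import urlopen

class MockUpstream(BaseHTTPRequestHandler):
    """Hypothetical mock upstream: only /v1/chat/completions is stubbed."""

    def do_POST(self):
        # Drain the request body so the connection stays well-behaved.
        self.rfile.read(int(self.headers.get("Content-Length", 0)))
        if self.path == "/v1/chat/completions":
            body = json.dumps(
                {"choices": [{"message": {"content": "ok"}}]}
            ).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            # /v1/responses is not mocked, so passthrough requests fail.
            self.send_error(404)

    def log_message(self, *args):
        pass  # keep test output quiet

server = HTTPServer(("127.0.0.1", 0), MockUpstream)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

# The translated chat-completions path works against the mock...
with urlopen(f"{base}/v1/chat/completions", data=b"{}") as resp:
    chat_status = resp.status
    chat_body = json.loads(resp.read())

# ...but a passthrough to /v1/responses does not.
try:
    urlopen(f"{base}/v1/responses", data=b"{}")
    passthrough_ok = True
except HTTPError as e:
    passthrough_ok = e.code != 404

server.shutdown()
```

The remaining mock-suite test therefore pins a non-OpenAI model so the gateway takes the translated route.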

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Adil Hafeez 2026-02-18 23:58:34 +00:00
parent d8e5e48f4a
commit a39e61ddeb
2 changed files with 6 additions and 37 deletions


@@ -241,11 +241,13 @@ def test_anthropic_client_streaming_openai_upstream(httpserver: HTTPServer):
 def test_responses_api_streaming_basic(httpserver: HTTPServer):
     """Responses API streaming: verify event types and content assembly"""
     # Gateway translates Responses API to /v1/chat/completions on upstream
+    # for non-OpenAI models (OpenAI models pass through to /v1/responses which
+    # doesn't work with mocks)
     setup_openai_chat_mock(httpserver, content="Responses API streaming works!")
     client = openai.OpenAI(api_key="test-key", base_url=f"{LLM_GATEWAY_BASE}/v1")
     stream = client.responses.create(
-        model="gpt-4o",
+        model="claude-sonnet-4-20250514",
         input="Hello",
         stream=True,
     )