OpenAI Responses API support
October 1, 2025

Responses API support is live in Superinterface.
We just rolled out full support for the OpenAI Responses API. Assistants now ship with native tools that feel instant, and threads stay perfectly in sync even when you hop between providers. If you have been building on the Assistants API or our managed storage, this release gives you more speed, more control, and more ways to extend what your assistants can do.
Native web search, images, and computer use
Every assistant configured with the Responses API now ships with first-party tools that run natively:
Web search is available out of the box, so assistants can pull fresh context without any extra setup.
File search continues to work exactly as it did with the Assistants API, indexing uploads for grounded responses.
Image generation exposes the Responses API’s rendering pipeline directly inside the dashboard.
Computer use lets assistants control a managed desktop session for complex workflows.
Code interpreter keeps running your snippets, analyses, and quick automations with the same interface you already know.
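For readers wiring this up against the API directly rather than the dashboard, a minimal sketch of what a Responses API request with these built-in tools looks like, assuming OpenAI's published tool schema (the exact type strings, e.g. web_search_preview vs. web_search, and the vector store ID are illustrative and may vary by model and API version):

```python
# Sketch of a Responses API request body enabling several built-in tools.
# The vector store ID below is a placeholder, not a real resource.
payload = {
    "model": "gpt-4.1",
    "input": "Summarize today's top AI news.",
    "tools": [
        {"type": "web_search_preview"},                 # web search
        {"type": "file_search",
         "vector_store_ids": ["vs_example123"]},        # grounded file search
        {"type": "image_generation"},                   # image rendering
        {"type": "code_interpreter",
         "container": {"type": "auto"}},                # sandboxed code execution
    ],
}

enabled = [tool["type"] for tool in payload["tools"]]
print(enabled)
```

Computer use is configured separately because it requires display parameters; see its settings section below.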
How to switch: Open an assistant in the dashboard, navigate to Threads Storage & Run Execution, and pick OpenAI Responses API. Future threads and runs for that assistant will live in OpenAI’s Responses platform.
Available tools for assistants using Responses API storage.
Image generation settings
Head into the tool settings to choose the right defaults for your use case. You can pick from Auto, Low, Medium, and High quality modes, switch between Auto or fixed aspect ratios (1024 × 1024, 1024 × 1536, 1536 × 1024), and deliver assets in PNG, WebP, or JPEG. You can also stream up to three partial renders before the final image lands, making iterative reviews feel instantaneous.
Image generation settings.
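These dashboard settings correspond to parameters on the Responses API's image_generation tool; a hedged sketch of that mapping, with values chosen for illustration (parameter names follow OpenAI's published schema as we understand it):

```python
# Illustrative image_generation tool configuration for a Responses API call.
image_tool = {
    "type": "image_generation",
    "quality": "high",        # auto | low | medium | high
    "size": "1024x1536",      # auto | 1024x1024 | 1024x1536 | 1536x1024
    "output_format": "webp",  # png | webp | jpeg
    "partial_images": 3,      # stream up to three partial renders before the final image
}
```

Setting partial_images above zero is what makes the streamed intermediate renders described above show up during a run.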
Computer use settings
Computer use now supports Linux, Windows, macOS, and browser-only environments. You control the virtual display size, attach any MCP server as the automation backbone, and lean on our open-sourced option at @supercorp-ai/computer-use-mcp. It can even stream a live video feed of the browser while the run is in progress.
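At the API level, the environment and virtual display size map onto the computer use tool's parameters; a minimal sketch assuming OpenAI's computer_use_preview schema (the dimensions are illustrative defaults, not requirements):

```python
# Illustrative computer use tool configuration for a Responses API call.
computer_tool = {
    "type": "computer_use_preview",
    "display_width": 1280,     # virtual display width in pixels
    "display_height": 800,     # virtual display height in pixels
    "environment": "browser",  # browser | mac | windows | ubuntu
}
```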
File search settings
File search stays available when you switch to the Responses API, so assistants can ground answers in PDFs, spreadsheets, and docs stored alongside each thread. Configure indexing behavior, storage quotas, or purge uploads from the dedicated settings panel.
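For reference, file search takes the same shape in a raw Responses API call as it did before; a sketch assuming OpenAI's file_search tool schema (the vector store ID is a placeholder, and max_num_results is an optional cap on retrieved chunks):

```python
# Illustrative file_search tool configuration for a Responses API call.
file_search_tool = {
    "type": "file_search",
    "vector_store_ids": ["vs_example123"],  # placeholder vector store ID
    "max_num_results": 8,                   # cap on retrieved chunks per query
}
```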
Faster threads with OpenAI Responses
The Responses API is a step change in latency compared to the Assistants API we supported previously. Runs execute directly against the provider’s native runtime, so assistants respond noticeably faster, especially when generating longer answers or streaming code blocks.
We tuned our run scheduler and caching across all storage providers while building this release. Even if you stay on Superinterface Cloud or the Assistants API, you’ll notice quicker load times when you open large threads or replay older runs.
Native MCP tool calling
The Responses API now speaks MCP natively. Tool invocations run directly against the provider without our compatibility layer, cutting latency and shrinking the surface for failures. Assistants can chain tools more reliably—whether that’s our open-sourced computer use server, your own MCP integrations, or community-built endpoints—because streaming updates and final results stay in sync with the run timeline.
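Attaching a remote MCP server in a raw Responses API call is a single tool entry; a sketch assuming OpenAI's remote MCP tool schema (the server label, URL, and approval policy below are placeholders for illustration):

```python
# Illustrative remote MCP tool configuration for a Responses API call.
mcp_tool = {
    "type": "mcp",
    "server_label": "computer-use",             # placeholder label
    "server_url": "https://example.com/mcp",    # placeholder MCP endpoint
    "require_approval": "never",                # or require approval per tool call
}
```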
A dedicated tools page inside assistants
Assistant settings now include a Tools page that keeps every capability in one place. Toggle web search, file search, image generation, computer use, and code interpreter for each assistant, and dive into per-tool settings when they’re available. The page highlights which storage providers support each tool, so you always know what will work before you deploy to production.
Switch providers in seconds
You can swap over to the Responses API from the Threads Storage & Run Execution selector in your assistant form. Once you switch, new runs and messages are stored directly in the OpenAI Responses API. Threads that were previously saved in the Assistants API or Superinterface Cloud stay put, so they won’t appear in the Responses view until you migrate them. We’re exploring migration tooling—let us know if you’re interested so we can move your workspace to the front of the queue.
What’s next
Azure OpenAI Responses API support is on deck. Azure has already launched the Responses API, but it still lacks the Conversations API needed for full feature parity. As soon as Azure ships the Conversations API, you’ll be able to select Azure as your storage provider and get the same speedups and native tools.
We can’t wait to see what you build with faster runs, richer tools, and a cleaner setup flow. Share your feedback—and if you’d like early migration tooling, raise your hand so we can prioritize your workspace.