Interview

OpenAI Codex launches macOS app: a companion to IDEs with adaptive thinking and multimodal input

Feb 2, 2026 with Thibault Sottiaux

Key Points

  • OpenAI launches a native macOS app for Codex, positioning it as an IDE companion, not a replacement, with adaptive thinking models that scale compute to task complexity.
  • The app ships with o3-mini as the default and the full o3 model as an option, multimodal voice and screenshot input, and Vercel deployment integration for front-end workflows.
  • Latency, not model capability, is the primary adoption bottleneck; cutting deep research from 20 minutes to 2 minutes could drive 10x more usage and willingness to pay.

Summary

OpenAI launched a native macOS app for Codex, its coding agent, positioning it as a companion to existing IDEs rather than a replacement. Thibault Sottiaux, who leads the Codex team at OpenAI, built the app around two goals: accessibility for people who are technically adjacent but unwilling to spend an hour configuring a development environment, and deeper productivity for professional engineers already running multiple parallel workstreams.

Model and adaptive thinking

The app ships with the o3-mini-equivalent Codex medium as the default, chosen through internal evaluations and user feedback. The model uses adaptive thinking, scaling compute to match task complexity: faster tasks get quicker responses, while harder problems receive more deliberate processing. The fuller o3-equivalent model, GPT-4.5, is also available; Sottiaux describes it as the model the app experience was actually built around, particularly for long-running autonomous tasks whose output matches the quality of a senior engineer's work.
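The adaptive-thinking behavior described above can be pictured as an effort router that matches deliberation to task difficulty. The sketch below is purely illustrative: the `estimate_complexity` heuristic, keyword list, and tier names are assumptions for the sake of the example, not OpenAI's implementation.

```python
# Illustrative sketch of "adaptive thinking": route a task to a
# reasoning-effort tier based on a crude complexity estimate.
# The scoring heuristic and tier names are assumptions, not OpenAI's.

def estimate_complexity(task: str) -> int:
    """Crude proxy: longer prompts and certain keywords suggest harder tasks."""
    score = len(task.split()) // 50
    for kw in ("refactor", "debug", "migrate", "architecture"):
        if kw in task.lower():
            score += 2
    return score

def pick_effort(task: str) -> str:
    """Map a complexity score to a thinking-effort tier."""
    score = estimate_complexity(task)
    if score == 0:
        return "low"      # quick response, minimal deliberation
    elif score <= 3:
        return "medium"   # default tier
    return "high"         # long-running, deliberate processing

print(pick_effort("rename this variable"))                       # low
print(pick_effort("debug the race condition and refactor this"))  # high
```

In a real system the router would of course live inside the model or serving stack rather than a keyword heuristic, but the shape of the trade-off, cheap responses for easy tasks and sustained compute for hard ones, is the same.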

A new personality toggle lets users choose between a friendly mode and the original pragmatic mode. Even members of the Codex team have switched to the friendly variant.

Multimodal input

The app bundles voice input and screenshot upload directly into the interface, removing friction for users working on front-end or visual projects. The core UX manages multiple projects and threads simultaneously without losing context, a workflow pattern Sottiaux observes among technical staff at OpenAI.

Companion, not replacement

Codex on desktop is currently a companion to IDEs, not a replacement. The longer-term direction Sottiaux describes is an agent capable enough that users primarily want to steer and supervise it rather than write code directly, with the app serving as the rich interaction surface for that supervision. Whether that trajectory eventually displaces the IDE remains open.

Deployment integration

OpenAI is leaning on an open skills standard rather than building proprietary hosting. The app ships with a Vercel deployment skill out of the box, letting users go from code to live deployment through the chat interface without reading documentation. Native hosting integration is not currently on the roadmap, though Sottiaux says it may be considered.

Routing from ChatGPT

Sottiaux acknowledges the opportunity to route ChatGPT users who encounter code interpreter or Canvas into Codex, but flags safety and security as genuine constraints when users don't understand what code is running on their machine. For now, Codex targets a technical or technical-adjacent audience. Bringing it to a broader consumer base, where the underlying code is invisible, is something the team is thinking about but hasn't solved.

Mobile

Sottiaux confirms a mobile experience is coming. He describes the ideal as starting a task from anywhere, handing off to Codex, and steering it remotely by phone. The macOS app came first because the primary target remains professional software engineers, and the Mac was the right platform to optimize for first.

Speed as the bottleneck

Latency is suppressing usage more than capability gaps. Cutting deep research from 20 minutes to 2 minutes would likely drive 10x more usage and 10x more willingness to pay. The same logic applies to coding. Speed, not model quality, may be the more important variable for near-term adoption.