Commentary

Siri needs an app — and Google will eventually pay Apple for LLM routing

Jan 13, 2026

Key Points

  • Apple's $1 billion Google deal to power Siri with Gemini is a near-term licensing arrangement whose economics will eventually flip in Apple's favor once LLM inference becomes profitable enough to monetize.
  • Apple has 500 million active Siri users but usage lags the iPhone base, signaling the assistant failed as a core product despite 14 years in market.
  • As inference costs fall and commercial queries generate revenue through affiliate fees and ads, Apple will gain leverage to charge Google for routing traffic rather than pay for capability.

Summary

Apple's $1 billion deal with Google to power Siri with Gemini is a near-term licensing arrangement, but the long-term economics will flip in Apple's favor as LLM inference becomes profitable enough to monetize.

Apple has roughly 500 million active Siri users, about a third of its 1.5 billion iPhone base, despite Siri shipping since 2011. The gap shows the assistant has failed to become a core part of the user experience. Integrating Gemini addresses Siri's capability problem, but the deal exposes a structural tension: as query economics improve, Google will eventually pay Apple for routing traffic rather than the reverse.

Most LLM queries today are loss-making. Weather checks and simple facts don't generate revenue. Commercial queries—"order me a TV," shopping recommendations, insurance lookups—can be monetized through affiliate fees, transaction revenue, or ads embedded in responses. As inference costs fall and per-query value rises, the math inverts: queries that cost money to run today will generate profit tomorrow, at which point Apple will have leverage to charge for routing. This mirrors Google's search business, where high-intent commercial queries subsidize informational ones.
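The crossover logic can be made concrete with a toy calculation. All figures below are hypothetical illustrations, not reported numbers: assume a commercial query earns a fixed amount in affiliate or ad revenue while inference cost declines by a fixed fraction each year.

```python
# Toy model of per-query LLM economics (all numbers hypothetical).
# Revenue per commercial query is held flat; inference cost falls by a
# fixed fraction each year. The query flips from loss-making to
# profitable once cost drops below revenue.

def years_until_profitable(cost, revenue, annual_cost_decline):
    """Return whole years until per-query profit turns positive."""
    years = 0
    while cost >= revenue:
        cost *= (1 - annual_cost_decline)  # e.g. 0.40 = costs fall 40%/yr
        years += 1
    return years

# Hypothetical: a query earning $0.02 but costing $0.10 to serve today,
# with costs falling 40% per year, turns profitable in 4 years.
print(years_until_profitable(cost=0.10, revenue=0.02, annual_cost_decline=0.40))  # → 4
```

The point of the sketch is only the direction of the curve: flat or rising per-query revenue plus steadily falling inference cost guarantees a crossover, and whoever controls query routing at that point captures the margin.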

Whether Siri needs a dedicated app remains contested. One argument holds that the chat interface pattern is now dominant enough that users expect scrollable history, context persistence, and the ability to resume conversations. These features require app-like interfaces. The counterargument is that model implementation matters far more than UI scaffolding, especially given Apple's track record with AI execution. Apple is hiring for 300 Siri-focused roles, suggesting serious investment in capability over distribution form.

The tension is real but not binary. Apple could ship a better Gemini-powered Siri without a dedicated app first, then add one later. But as LLM responses grow longer and users expect more conversational back-and-forth, voice-only or shallow UI integration becomes harder to sustain. An app would let users scroll back through past responses, search history semantically, and switch contexts without re-prompting. Chat apps have normalized these UX patterns.

For Apple, the priority should be making Siri genuinely useful before worrying about how to package it. For Google, the priority is access to Apple's 500 million users and a foot in the door before agentic AI shifts the whole category toward direct payment models.