Apple is preparing a significant overhaul of its iPhone camera experience, with a new Siri-powered mode expected to debut in iOS 27, according to a Bloomberg report.
The new “Siri mode” will sit alongside existing camera options such as Photo and Video, embedding artificial intelligence directly into the camera interface. The feature builds on Apple’s existing Visual Intelligence system but shifts it from a relatively limited, hardware-linked function into a more visible, app-level experience. This repositioning indicates Apple’s intent to make AI-driven visual interactions a core part of how users engage with the camera.
Under the new setup, users will be able to point the camera at objects, text or scenes and interact with them through Siri, asking questions or triggering actions. The mode is also expected to integrate external AI services such as OpenAI’s ChatGPT and image search tools to deliver contextual responses, reflecting a broader move to combine on-device and cloud-based intelligence.
The update effectively promotes Visual Intelligence from a secondary shortcut to a primary feature. Apple is also redesigning elements of the interface, including a shutter button styled to match its “Apple Intelligence” branding that signals when AI features are active.
Expanded capabilities are expected to include scanning nutrition labels, extracting contact details from printed material and generating actionable outputs, such as saving information directly to apps. These additions point to a shift from passive image capture toward real-time interpretation and task execution.
The move is part of a wider push to embed artificial intelligence across Apple’s ecosystem, with iOS 27 also likely to introduce AI-led photo editing tools and deeper Siri integration across applications. The update builds on Apple’s recent efforts to reposition Siri as a more capable assistant amid intensifying competition in consumer AI.
The reported changes also suggest a tighter coupling between Apple’s camera stack and its broader AI architecture, where image capture becomes an entry point for contextual computing rather than a standalone function. By embedding Siri at the point of capture, Apple appears to be aligning hardware, software and services more closely, thereby potentially increasing user reliance on its ecosystem.