iOS 26 screenshots could be an intriguing preview of Apple's delayed Siri rework

When it launched, Apple's Visual Intelligence feature allowed you to point your compatible iPhone's camera at things around you and either perform a Google Image Search or ask questions via ChatGPT. At WWDC 2025, the company showed off updates to broaden the usefulness of Visual Intelligence, largely by embedding it into the screenshots system. To quote the company's press release, "Visual intelligence already helps users learn about objects and places around them using their iPhone camera, and it now enables users to do more, faster, with the content on their iPhone screen."

This reminded me of the "onscreen awareness" that Apple described as one of Siri's capabilities when it announced Apple Intelligence last year. In that press release, the company said, "With onscreen awareness, Siri will be able to understand and take action with users’ content in more apps over time." Though it's not quite the same, the updated screenshot-based Visual Intelligence more or less lets your iPhone serve up contextual actions from your onscreen content, just not via Siri.

In a way, it makes sense. Most people are already accustomed to taking a screenshot when they want to share or save important information they've seen on a website or in an Instagram post. Integrating Apple Intelligence actions here would theoretically put the tools where you expect them, rather than making you talk to Siri (or wait for its overhaul to roll out).

Basically, in iOS 26 (on devices that support Apple Intelligence), pressing the power and volume down buttons to take a screenshot pulls up a new page. Instead of a thumbnail of your saved image appearing in the bottom left, the picture takes up almost all of the display, surrounded by options for editing, sharing or saving the file, with Apple Intelligence-based answers and actions at the bottom. In the bottom left and right corners sit options for asking ChatGPT and running a Google Image Search, respectively.

Depending on what's in your screenshot, Apple Intelligence can suggest various actions below the image: asking where to buy a similar-looking item, adding an event to your calendar or identifying types of plants, animals or food, for instance. If there's a lot going on in your screenshot, you can draw on an item to highlight it (similar to how you select an object to erase in Photos) and get information specific to that part of the image.

Third-party apps and services that have enabled App Intents, like Google, Etsy and Pinterest, can also appear here, so you can carry out actions without leaving this space. For example, if you've found a bookend you like, taken a screenshot and identified it, you can shop for it on Etsy or pin it on Pinterest.
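For a sense of what "enabled App Intents" means in practice: developers expose their app's actions to the system through Apple's App Intents framework, and those declared actions are what surfaces like this new screenshot interface can offer. Purely as an illustrative sketch (the intent name, parameter and behavior below are hypothetical, not Etsy's or Pinterest's actual implementation), a minimal App Intent might look something like this:

```swift
import AppIntents

// Hypothetical sketch of a third-party "save this item" action.
// The name, parameter and logic are illustrative, not any real app's API.
struct SaveFoundItemIntent: AppIntent {
    static var title: LocalizedStringResource = "Save Found Item"
    static var description = IntentDescription("Saves an item you spotted to your collection.")

    // Information the system can hand off about what the user highlighted.
    @Parameter(title: "Item Name")
    var itemName: String

    func perform() async throws -> some IntentResult {
        // A real app would persist the item to the user's account here.
        return .result()
    }
}
```

Once an app declares actions this way, the system knows they exist; which ones the screenshot interface chooses to surface, and when, is up to Apple.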

One aspect of this update to Visual Intelligence that gives me pause is that, for people like me who screenshot mindlessly and don't want to do anything other than get receipts, this might add a frustrating step between capturing a screenshot and saving it to Photos. It sounds like you may be able to turn off this interface and stick to the existing screenshot system, though.

The examples that Apple gave for Siri's ability to understand what's on your screen felt somewhat similar. In its press release from last year, Apple said, "For example, if a friend texts a user their new address in Messages, the receiver can say, 'Add this address to his contact card.'"

Like Visual Intelligence in screenshots, this involves scanning the onscreen content for pertinent information and helping you put it in a place (like Contacts or Calendar) where it's most useful. However, the promise of Siri's new era was more about interacting with all parts of your phone, across first- and third-party apps alike. So you could ask the assistant to open an article you added to your Reading List in Safari or send photos from a specified event to a contact.

It's clear Apple has yet to deliver these advancements to Siri, and as Craig Federighi said at the WWDC 2025 keynote, they might only be discussed later this year. Still, as we await that status update, the changes coming to screenshots might be a preview of things to come.

This article originally appeared on Engadget at https://www.engadget.com/ai/ios-26-screenshots-could-be-an-intriguing-preview-of-apples-delayed-siri-rework-183005404.html?src=rss