At WWDC 2025, Apple revealed a major visual shake-up for iOS (not to mention the rest of the company’s operating systems). Aesthetically, it’s the biggest change since iOS 7 ditched the stitching, textures and skeuomorphic design of earlier releases. It also comes with significantly fewer AI and Siri updates this time around. However, it’s the smaller touches that make iOS 26 seem like a notable improvement over its predecessor.
I’ve been running the iOS 26 developer beta for the last two weeks and here's how Apple’s new Liquid Glass design — and iOS 26 broadly — stacks up.
(Ed. note: Apple just released the public betas for iOS 26, iPadOS 26, macOS 26 and watchOS 26. This means you can run the preview for yourself, if you’re willing to risk potentially buggy or unstable software that could cause some of your apps to stop working. As usual, we highly recommend backing up all your data before running any beta, and you can follow our guide on how to install Apple's public betas to do so.)
Liquid Glass changes everything
iOS 26 looks new and modern. And for once, how Apple describes it — liquid glass — makes sense: it’s a lot of layers of transparent elements overlapping and, in places, the animations are quite… liquidy. Menus and buttons will respond to your touch, with some of them coalescing around your finger and sometimes separating out into new menus.
Liquid Glass encompasses the entire design of iOS. The home and lock screens have been redesigned once again, featuring a new skyscraping clock font that stretches out from the background of your photos, with ever-so-slight transparency. There’s also a new 3D effect that infuses your photos with a bit of spatial magic, offering a touch of Vision Pro for iPhone users.
The experience in the first few builds of the iOS 26 beta was jarring and messy, especially with transparent icons and notifications, due to those overlapping elements making things almost illegible. Updates across subsequent releases have addressed this issue by making floating elements more opaque. There is also a toggle within the Accessibility tab in Settings to reduce transparency further, but I hope Apple offers a slider so that users can choose exactly how “liquid” they want their “glass” to be. If you own other Apple products, then you’ll come to appreciate the design parity across your Mac, iPad and Apple Watch.
One noticeable change I'd been waiting for was the iOS search bar’s relocation to the bottom of the screen. I first noticed it within Settings, but it reappears in Music, Podcasts, Photos and pretty much everywhere you might need to find specific files or menu items now. If, like me, you’re an iPhone Pro or Plus user, you may have struggled to reach those search bars when they were at the top of the screen. It’s a welcome improvement.
Visual Intelligence
With iOS 26 on iPhones powerful enough to run Apple Intelligence, the company is bringing Visual Intelligence over to your screenshots. (Previously it was limited to Camera.) Once you’ve grabbed a shot by pressing the power and volume up buttons, you’ll get a preview of your image, surrounded by suggested actions that Apple Intelligence deduced would be relevant based on the contents of your screenshot.
Managing Editor Cherlynn Low did a deep dive on what Visual Intelligence is capable of. From a screenshot, you can transfer information to other apps without having to switch to or select them manually. This means I can easily screenshot tickets and emails, for example, to add to my calendar. Apple Intelligence can also identify plants, food and even cars. If there are multiple people or objects in your screenshot, you can highlight what you want to focus on by circling it. There aren’t many third-party app options at this point, but that’s often the case with a beta build. These are features Android users have had courtesy of Gemini for a year or two, but at least now iPhone owners get something similar.
One quick tip: Make sure to tap the markup button (the little pencil tip icon) to see Visual Intelligence in your screenshots. I initially thought my beta build was missing the feature, but it was just hidden behind the markup menu.
More broadly, Apple Intelligence continues to work well, but doesn't stand out in any particular way. We’re still waiting for Siri to receive its promised upgrades. Still, iOS 26 appears to have improved the performance of many features that use the iPhone’s onboard machine learning models. Since the first developer build, transcriptions of voice memos and voice notes have become not only much faster, but also more accurate, especially with accents the system previously struggled with.
Apple Intelligence’s Writing Tools — which I mainly use for summarizing meetings, conference calls and even lengthy PDFs — no longer choke on more substantial reading. On iOS 18, they would struggle with voice notes longer than 10 minutes, trying to untangle or structure the contents of a meeting. I haven’t had that issue with iOS 26 so far.
Genmoji and Image Playground both produce noticeably different results after the update. Image Playground can now generate pictures using ChatGPT. I’ll be honest, I hadn’t used the app since I tested it on iOS 18, but the upgrades give it more utility when I do want to generate AI artwork, which can occasionally reach photorealistic levels.
One useful addition is ChatGPT’s “any style” option, meaning you can try to specify the style you have in mind, which can skirt a little closer to contentious mimicry — especially if you want, say, a frivolous image of you splashing in a puddle, Studio Ghibli style.
Apple also tweaked Genmoji to add deeper customization options, but these AI-generated avatars don’t look like me? I liked the original Genmoji that launched last year, which had the almost-nostalgic style of 2010 emoji, but still somehow channeled the auras of me, my friends and family. This new batch is more detailed and elaborate, sure, but the results just don’t look right. Also, they make me look bald. And contrary to my detractors, I am not bald. Yet. This feels like a direct attack, Apple.
You might feel differently, however. For example, Cherlynn said that the first version of Genmoji did not resemble her, frequently presenting her as someone with much darker skin or of a different ethnicity, regardless of the source picture she submitted.
Still, the ability to change a Genmoji’s expression, as well as add and remove glasses and facial hair through the new appearance customization options, is an improvement.
A Camera app redesign for everyone
Apple has revisited the camera app, returning to basics by stripping away most of the previously offered modes and settings — at least initially — to display only video and photo modes.
You can swipe up from the bottom to see additional options, like flash, the timer, exposure, styles and more. You can also tap on the new six-dot icon in the upper right of the interface for the same options, though that requires a bit more of a reach. These behave in line with the new Liquid Glass design and you’ll see the Photo pill expand into the settings menu when you press either area. Long-pressing on icons lets you go deeper into shooting modes, adjusting frame rates and even recording resolutions.
What I like here is that it benefits casual smartphone photographers while keeping all the settings that more advanced users demand. None of the updates here are earth-shattering, though. I hope Apple takes a good look at what Adobe’s Project Indigo camera app is doing — there are a lot of good ideas there.
One extra improvement if you use AirPods: Pressing and holding the stem of your AirPod (if it has an H2 chip) can now start video recording.
Apple Music tries to DJ
Alongside the Liquid Glass design touches, the big addition to Apple Music this year is AutoMix. Like a (much) more advanced version of the crossfade feature found on most music streaming apps, in iOS 26, Music tries to mix between tracks, slowing or speeding up tempos, gently fading in drums or bass loops before the next song kicks in. Twenty percent of the time, it doesn’t work well — or Apple Music doesn’t even try. But the new ability to pin playlists and albums is useful, especially for recommendations from other folks that you never got around to listening to.
Messages get a little more fun
Apple is making Messages more fun. One of the ways it’s doing so is by enabling custom backgrounds in chats, much like in WhatsApp. I immediately set out to find the most embarrassing photo of my colleague (and frenemy) Cherlynn Low and make it our chat background. I know she’s also running iOS 26 in beta, so she will see it. [Ed. note: Way to give me a reason to ignore your messages, Mat!]
Apple’s Live translation now works across Messages, voice calls and FaceTime. Setting things up can be a little complicated — you’ll first need to download various language files to use the feature. There’s also some inconsistency in the languages supported across the board. For instance, Mandarin and Japanese are supported in Messages, but not on FaceTime yet. In chats, if your system language is set to English or Spanish, then you’ll only be able to translate into English or Spanish. For those polyglots out there, if you want to translate incoming Japanese texts into German, you’ll need to set your device's language to German.
While I didn’t get to flex my Japanese abilities on voice calls and FaceTime, iOS 26 was more than capable of keeping up with some rudimentary German and Spanish. I’m not sure if I’d rely on it for serious business translation or holiday bookings, but I think it could be a very useful tool for basics.
There’s also the ability to filter spam messages into their own little folder (purgatory). Spam texts remain a nightmare, so I appreciate any potential weapons in the fight. Sadly, it hasn’t quite managed to deal with the TikTok marketing agencies and phone network customer services that continue to barrage my Messages. Still, hopefully Apple will continue to improve its detection algorithms.
One more tool in the battle against spam: You can mute notifications for Messages from unknown numbers, although time-sensitive alerts from delivery services and rideshare apps will still reach you.
New apps are hit-or-miss
Not everything in the beta lands, however. I’ve already touched on how Liquid Glass was initially a semi-transparent mess. The Games app, too, seems like an unnecessary addition. It’s a blend of the App Store’s Games tab and a silo of your preinstalled games, and I’m not sure what it adds. It’s not any easier to navigate, nor does it introduce me to games I want to buy.
Cherlynn did want to highlight that for a casual gamer like herself, it’s intriguing to see if the Games app might start to recommend more mind-numbing puzzles or farming simulations. She was also intrigued by the idea of a more social gaming experience on iOS, issuing challenges to her friends. Still, because the phone she has been testing the beta on doesn’t have access to all her contacts or her gaming history, the recommendations and features are fairly limited at the moment.
Games is one of two new apps that will automatically join your home screen. (Fortunately, they can be uninstalled). The other is Preview, which should be a familiar addition to any Mac user. It offers an easy way to view sent or downloaded files, like menus, ticket QR codes and more. During the developer beta, the app pulled in a handful of my documents that previously lived in the Files app. Navigation across both those apps is identical, although Preview is limited to files you can actually open, of course.
AirPods, upgraded
This is more iPhone-adjacent, but iOS 26 includes several quality-of-life improvements for some of Apple’s headphones. First up: notifications when your AirPods are fully charged, finally! The Apple Watch got this kind of notification back in iOS 14, so it’s great to see Apple’s headphones catch up.
Apple is also promising “studio-quality sound recording” from the AirPods, augmenting recordings with computational audio improvements. There’s a noticeable bump in audio quality. It appears that AirPods 4 and AirPods Pro 2 record files at a sample rate of 48 kHz, double the rate used in the past. That sample-rate bump actually happened last year, but it depends on the app you’re using. Is it “studio quality”? I don’t think so, but it’s an improvement. While recordings sound slightly better in quiet locations, the bigger change is in loud environments: the algorithm no longer seems to degrade audio quality as much while trying to reduce background noise.
iOS 26 also adds sleep detection to the buds. If the AirPods detect minimal movement, they’ll switch off automatically, which could be helpful for the next time I’m flying long-haul.
Wrap-up
In iOS 26, Apple has prioritized design changes and systemwide consistency over AI-centric software and features. While Liquid Glass is a big change to how your iPhone looks, Apple has drawn on user feedback to make the design feel less jarring and to gel better when the home screen, Control Center and notification drop-downs overlap with one another.
There are numerous quality-of-life improvements, particularly to Messages and Visual Intelligence. If anything, the AI elephant in the room is the lack of any substantial update to Siri. After the company talked up advanced Siri interactions over a year ago, I’m still waiting for its assistant to catch up with the likes of Google.
This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/ios-26-beta-preview-liquid-glass-is-better-than-you-think-172155402.html?src=rss