The 5 big Apple Intelligence updates you can try right now in the iOS 26 public beta

Visual Intelligence in iOS 26 Beta
(Image credit: Jacob Krol/Future)

Apple has now released the public betas for all the software it's launching later this year, which means you can now experience everything iOS 26 has to offer.

While the headline feature will be Liquid Glass, a complete redesign of the iPhone user experience, installing iOS 26 will also give you access to some new Apple Intelligence features that might improve your life in subtle yet meaningful ways.

I've whittled down the list of new AI-powered features coming in iOS 26 (and iPadOS 26) and selected my five favorites. From in-app translation to new Apple Intelligence abilities in Shortcuts, there are plenty of reasons to get excited for the iOS 26 public beta.

1. Live Translation

Apple FaceTime Live Translation

(Image credit: Apple)

After using the iOS 26 developer beta for over a month, I've found Live Translation to be my personal favorite new Apple Intelligence feature.

Built into Messages, FaceTime, and the Phone app, Live Translation lets you automatically translate messages, add translated live captions to FaceTime calls, and hear translations spoken aloud throughout a phone call, using AI to remove language barriers entirely.

I've written about how Live Translation has drastically improved my ability to communicate with my Italian in-laws, and trust me, if you regularly speak multiple languages, you'll also find this new Apple Intelligence feature to be a game-changer.

2. Genmoji and Image Playground upgrades

New messages features from WWDC 2025

(Image credit: Lance Ulanoff / Future)

Apple launched Genmoji and Image Playground as part of the first wave of Apple Intelligence features, and now the company has improved its generative AI image tools.

You can now turn text descriptions into emoji, mix existing emoji together, and combine them with descriptions to create something new. You can also change expressions and adjust personal attributes of Genmoji made from photos of friends and family members.

Image Playground has also gained ChatGPT support, giving users access to brand-new styles such as oil painting and vector art. Apple says, "users are always in control, and nothing is shared with ChatGPT without their permission."

I've enjoyed my time using ChatGPT in Image Playground. While it's still not as good as some of the other best AI image generators out there, it improves the Image Playground experience and is a step in the right direction for Apple's creative AI tool.

3. Visual Intelligence can now see your screen

Using Visual Intelligence to add to calendar in iOS 26

(Image credit: Jacob Krol/Future)

Visual Intelligence might already have been the best Apple Intelligence feature, but now the iPhone 16-exclusive AI tool is even better.

In iOS 26, Visual Intelligence can now scan your screen, allowing users to search and take action on anything they’re viewing across apps.

You can ask ChatGPT questions about content on your screen via Apple Intelligence, and the new feature is accessed by taking a screenshot. Press the same buttons you'd use to take a screenshot in iOS 26, and you'll now be asked whether to save or share the screenshot, or to explore it further with Visual Intelligence.

If you're like me and take screenshots regularly to remember information, Visual Intelligence on iOS 26 could be the Apple Intelligence feature you've been waiting for.

4. Third-party apps get Apple Intelligence access

WWDC 2025 Apple Visual Intelligence

(Image credit: Apple)

While this entry isn't a feature per se, this iOS 26 addition is a big one for the future of Apple Intelligence: Developers now have access to Apple's Foundation Models.

What does that mean exactly? Well, app developers can now "build on Apple Intelligence to bring users new experiences that are intelligent, available when they’re offline, and that protect their privacy, using AI inference that is free of cost."

Apple showcased an example at WWDC 2025 of an education app using the Apple Intelligence model to generate a quiz from your notes, without any API costs.

This framework could completely change the way we interact with our favorite third-party apps, which can now tap into Apple's AI models to make the user experience even more intuitive.
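If you're curious what that might look like in practice, here's a rough sketch of how a third-party app could call the on-device model, inspired by Apple's education-app quiz example. It assumes the FoundationModels framework previewed at WWDC 2025 exposes a LanguageModelSession with a respond(to:) call; exact names and availability may differ in the shipping beta, so treat this as illustrative rather than definitive.

```swift
import FoundationModels

// Illustrative sketch only: assumes the FoundationModels framework's
// LanguageModelSession API as previewed at WWDC 2025; names may change.
func quizQuestion(from note: String) async throws -> String {
    // Inference runs on-device, so it works offline and has no API cost.
    let session = LanguageModelSession(
        instructions: "You write one short quiz question based on study notes."
    )
    let response = try await session.respond(
        to: "Write a single quiz question covering this note:\n\(note)"
    )
    return response.content
}
```

Because the model runs on-device, an app like the education example above could generate quizzes from your notes without sending them to a server or paying for API calls, which is exactly the pitch Apple made at WWDC 2025.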

5. AI-powered Shortcuts

Image showing how to open Shortcuts and add a new shortcut

(Image credit: Future)

Last but not least, Apple Intelligence is now available in the Shortcuts app. This is a major upgrade to one of the best apps on Apple devices, allowing users to "tap into intelligent actions, a whole new set of shortcuts enabled by Apple Intelligence."

I've tried Apple Intelligence-powered shortcuts, and as with the Shortcuts app itself, the true power here will come down to what users create and how people tap into this new ability. As someone who uses Shortcuts daily, I'm incredibly excited to see how the fantastic community of people who build powerful shortcuts and share them online will harness Apple Intelligence's capabilities.

This is not an AI improvement everyone is going to use, but if you choose to delve into the world of the Shortcuts app and learn how to get the most from it, this new iOS 26 addition might be the best of the lot.

John-Anthony Disotto
Senior Writer, AI

John-Anthony Disotto is TechRadar's Senior Writer, AI, bringing you the latest news on, and comprehensive coverage of, tech's biggest buzzword. An expert on all things Apple, he was previously iMore's How To Editor, and has a monthly column in MacFormat. John-Anthony has used the Apple ecosystem for over a decade, and is an award-winning journalist with years of experience in editorial.
