New Apple Intelligence features are available today
Apple today released new Apple Intelligence features that elevate the user experience across iPhone, iPad, Mac, Apple Watch, and Apple Vision Pro. Deeply integrated across operating systems, today’s new features help users seamlessly communicate across languages with Live Translation; use visual intelligence to learn more about the onscreen content across their apps;1 and express themselves with enhancements to Genmoji and Image Playground.2 Additionally, Shortcuts can now tap into Apple Intelligence models directly to accelerate workflows, and developers are starting to take advantage of the on-device foundation model at the core of Apple Intelligence to build intelligent, privacy-protected features into their apps.
Users with an Apple Intelligence-enabled device can start experiencing new features today with iOS 26, iPadOS 26, macOS Tahoe 26, watchOS 26, and visionOS 26. Additionally, Apple Intelligence features will be coming soon to eight more languages: Danish, Dutch, Norwegian, Portuguese (Portugal), Swedish, Turkish, Chinese (traditional), and Vietnamese.
Break Down Language Barriers with Live Translation
For those moments when a language barrier gets in the way, Live Translation can help users communicate across select languages when messaging or speaking. The feature is seamlessly integrated into Messages, FaceTime, and Phone, and users can also access it with AirPods Pro 3 to translate in-person conversations, enabled by Apple Intelligence on iPhone.3
With this powerful new feature, users can translate all types of conversations — whether over the phone or while using FaceTime, in person, or asynchronously in Messages — right when they need to. It protects users’ privacy with on-device processing, so personal conversations stay personal.
In Messages, Live Translation can automatically translate a user’s response as they type and deliver it in the recipient’s language.4 During FaceTime calls, users can follow along with live translated captions while still hearing their friend or family member’s voice. And during a phone call, the translation is spoken out loud in real time.5 Users can access Live Translation on AirPods with an all-new gesture by simultaneously pressing both AirPods stems; by saying “Siri, start Live Translation”; or even from the Action button on iPhone, allowing them to hear the conversation in their preferred language. Active Noise Cancellation (ANC) lowers the volume of the person speaking so it’s easier to focus on the translation.
And by the end of the year, Live Translation for Phone, FaceTime, and AirPods will expand language support to include Italian, Japanese, Korean, Chinese (Mandarin, simplified), and Chinese (Mandarin, traditional).
Posted on September 17, 2025