Apple (NASDAQ:AAPL) has revealed what it calls the most substantial redesign ever of its operating systems, with OS 26 featuring Liquid Glass, a new design material expected to enhance the navigation experience, the company said today at its Worldwide Developers Conference 2025.
“Today, we are excited to announce our broadest update ever,” said Alan Dye, Apple’s vice president of Human Interface Design. “We are introducing a universal design across all our products. Meticulously crafted by rethinking the fundamental elements that make up our software, the new design features an entirely new material called Liquid Glass. It combines the optical qualities of glass with a fluidity only Apple can achieve, as it transforms depending on your content or context.”
“Our new design blurs the lines between hardware and software,” he added. “This sets the stage for the next era of our products.”
The new design will roll out across all Apple hardware, including the iPhone, iPad, Apple Watch, Apple TV and Mac.
Apple is also updating a swath of its native apps, including Camera, Safari, Maps, CarPlay, Messages, Phone, Wallet, IDs and Gaming.
Its Image Playground and Genmoji features are also receiving a boost from OpenAI's ChatGPT, which can be used to create new images and emojis.
“Image Playground sends a user’s description or photo to ChatGPT and creates a unique image,” Apple said. “Users are always in control, and nothing is shared with ChatGPT without their permission.”
Apple is also adding Visual Intelligence, which allows users to learn more about what is on their screen. Users can ask ChatGPT for information on the people, places and objects currently displayed without having to jump between apps.
Apple introduced Live Translation as well. Integrated into Messages, FaceTime and Phone, it provides real-time language translation, and it works even when the other person is not using an iPhone.
“We are working to make Siri more personable,” said Craig Federighi, Apple’s senior vice president of Software Engineering. “We are expanding languages for Apple Intelligence this year. We’re opening up access for any app to tap into Apple Intelligence through our Foundation Models Framework.”