Brief review of WWDC25: System innovation, AI pragmatism

WWDC25 (Apple's Worldwide Developers Conference), one of the most closely watched events in the tech industry, once again put Apple in the spotlight. This year's conference not only introduced a new operating system design language, but also demonstrated a pragmatic shift in Apple's approach to AI. This article looks closely at the two core highlights of WWDC25: innovation in system design and a pragmatic rollout of AI features.

As previously rumored, Apple unveiled a new design language at this year's WWDC and unified it across all of its operating systems. On the AI side, the company appears to have changed its development strategy, shifting from the pursuit of a system-level AI experience to grounding AI in concrete application scenarios.

My personal summary of this WWDC: system innovation, AI pragmatism.

1. Innovating the system to prepare for the next generation of products

After twelve years, Apple has finally revisited its operating system design language, launching a new one called Liquid Glass across all of its platforms. This is Apple's largest design update to date, bringing translucent, glossy, and dynamically responsive visuals to user interfaces throughout the Apple ecosystem, which Apple describes as "smart, pleasing to the eye, but still familiar."

Apple could move on from its previous flat design language because the performance of its processors has improved enough to provide the computing power that Liquid Glass's fluid, glass-like rendering requires. But the ambition behind the change points to the future: preparing the software in advance for the next generation of products.

As Apple put it at the keynote, the new design language also lays the foundation for the company's next generation of products and everyday interaction experiences.

Industry rumors have suggested that the design language overhaul is Apple's preparation for the 20th anniversary of the iPhone, with reports that Apple may release an all-glass iPhone to mark the occasion.

Setting the long-term planning aside and judging iOS 26 itself from the first developer preview, I think the Liquid Glass design greatly improves the system's sense of agility; its real-time rendering makes it feel as if you are holding a pane of transparent glass and sliding it across the various interface layers.

Of course, this is only the first developer preview. Apple still needs to refine Liquid Glass to balance visual interference against information readability in complex, multi-layered contexts; user feedback has been most pointed about the password-entry and Control Center interfaces. The preview build also has issues with heat, app compatibility, and fluency that Apple will need to address.

On the whole, though, I think the Liquid Glass design language Apple has brought to iOS 26 holds nothing back: it represents Apple's determination and courage to break new ground, and it shows the company's bold thinking about the future of system interaction.

For a player like Apple, which has always looked three steps ahead while taking one, the decisive abandonment of flat design in favor of Liquid Glass must be tied to the next stage in the evolution of the iPhone and its other products, and that is something worth looking forward to.

2. Pragmatic AI that serves users in high-frequency, perceptible scenarios

If Apple was bold in system design this time, in AI it can be said to have shifted from ambition to pragmatic experience.

Last year Apple unveiled the ambitious Apple Intelligence at WWDC. Although many features have shipped over the past year, it is hard to be optimistic about Apple Intelligence as a whole; in particular, the Siri capabilities demonstrated at the time were later judged by the market to exist only as a demo.

This gap between promise and reality has been a real blow to Apple's image in the AI era. Apple now seems to acknowledge this phased setback: Craig Federighi, Apple's senior vice president of software engineering, said bluntly in his presentation that the Siri updates will have to wait until next year.

This year, Apple is focusing its AI capabilities on high-frequency scenarios that are technically less demanding but more useful to the everyday experience: live translation, spatial photos, call assistance, screenshot recognition, information extraction, charging time prediction, app power consumption analysis, and so on.

There is, admittedly, a large gap between these features and the system-level Apple Intelligence demonstrated last year, but they play a high-frequency, perceptible role in users' daily routines and genuinely serve the needs of more users.

Getting high-frequency, perceptible AI features right first, and only then moving step by step toward the system-level AI experience that Apple Intelligence describes, is a new and more pragmatic path. This is especially true for the Chinese market.

Beyond these dessert-style AI features, this WWDC was not without bigger moves: Apple's decision to open the Foundation Models framework to developers is particularly noteworthy.

With this move, Apple opens its on-device model to all third-party developers, who can invoke it in their apps with just a few lines of code to build smarter applications, cut development costs, protect user privacy, and deliver AI experiences that run offline.
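As a rough illustration of what "a few lines of code" can look like, here is a minimal Swift sketch modeled on Apple's published Foundation Models examples; the function name and prompt are my own, and exact API details may differ as the framework evolves:

```swift
import FoundationModels

// Minimal sketch: ask the on-device model for a short suggestion.
// Assumes a device that supports Apple Intelligence and has the model available.
func suggestTitle(for note: String) async throws -> String {
    // A session represents one conversation with the on-device model.
    let session = LanguageModelSession(
        instructions: "You suggest concise, friendly titles for user notes."
    )
    // Send a prompt and await the generated text; the note never leaves the device.
    let response = try await session.respond(to: "Suggest a title for this note: \(note)")
    return response.content
}
```

Because the model runs entirely on device, a call like this works offline and keeps the user's content local, which is exactly the privacy and cost argument Apple is making to developers.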

Given the installed base of Apple devices, their on-device computing power, and the size of the developer community, opening on-device intelligence to developers could bring AI-capable apps to the Apple ecosystem faster, letting Apple stake out the high ground of the AI application ecosystem early.

In the final analysis, competition among smart devices ultimately comes down to the vitality of their software ecosystems, so this can be seen as Apple playing an opening move.

Final thoughts

At this WWDC, Apple delivered a long-awaited change in system design. The Liquid Glass design language, applied across every operating platform, fully shows off Apple's "excessive" design engineering capability, and it shows a company willing to leave its comfort zone, reinvent itself, and explore the future. That is the surprise of this WWDC.

Apple's slower pace in AI, however, remains the regret of this WWDC. Although Apple showed plenty of AI features and empowered developers through the Foundation Models framework, none of that hides the current predicament of Apple Intelligence, and the slow progress leaves many wondering whether Apple can stay at the forefront of the AI wave.
