It had been rumored that Apple would integrate advances in artificial intelligence across all its new operating systems at this WWDC24, and that is exactly what happened. They have done so under the banner of Apple Intelligence, a set of features built around understanding the user's personal context to help them communicate, work, and express themselves in daily life. All of this while guaranteeing user privacy through execution on the device itself and on highly protected servers.
Apple Intelligence, the bundle of Apple's AI features
At its core, Apple Intelligence is made up of a number of large processing systems that rely on personal context to offer content that is high-quality, useful, and personal. It is integrated into applications, helping the user communicate, work, and express themselves.
Regarding its capabilities, it can understand language, images, actions within applications, and the user's own personal context. On the language side, Apple says Apple Intelligence has a deep understanding of natural language, so interacting with it is quick and easy. Thanks to this, we can write long paragraphs from a brief idea, or rewrite a paragraph to fix spelling problems, in a large number of applications.
As for images, this technology is capable of understanding and analyzing static images, emoji, and GIFs. Thanks to generative AI, it can also create images directly with its own AI models, for example generating a cartoon-style image of a friend (drawing on their picture from our photo library) in a personalized context. All of these experiences are integrated across applications. It also understands how we use apps and what we do inside them, handling requests like "play me the podcast that my wife sent me the other day" or "show me the attachments that Juan sent me by email the other day."
Finally, Apple Intelligence will understand each user's personal context by analyzing the content of their applications, making our connection with this technology unique, with privacy as the fundamental pillar that Apple prioritizes across this entire structure.
The cornerstone of Apple Intelligence is processing and operation on the device itself, made possible by the integration between hardware and software across the latest range of Apple Silicon chips, from the A17 Pro and the M1 onwards.
Apple calls the technology that organizes and surfaces information from all applications the semantic index. That content can then be fed into large language models running on Apple Silicon to produce a response. This is achieved through a hybrid arrangement between the device itself and Apple servers using the new Private Cloud Compute, which scales the system's computing capacity onto servers while still protecting our privacy. These servers, as had been rumored, are built on Apple Silicon and programmed with Xcode.
When we make a request, Apple Intelligence decides whether to execute the action on the device or escalate the request to the server, under three premises: no information will be stored, data will be used only for our requests, and these privacy guarantees can be verified by independent experts.
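The two-tier decision described above can be sketched conceptually. Apple has not published this logic, so everything here, the function name, the cost-budget heuristic, and the labels, is purely a hypothetical illustration of the idea of preferring the device and escalating only when a request exceeds local capacity:

```python
# Illustrative sketch only: Apple's actual routing logic is not public.
# "estimated_cost" and "on_device_budget" are hypothetical stand-ins for
# whatever metric the system uses to size a request.

def route_request(estimated_cost: float, on_device_budget: float) -> str:
    """Decide where a request runs, mirroring the two-tier design:
    prefer on-device execution, escalate only when necessary."""
    if estimated_cost <= on_device_budget:
        return "on-device"
    # Escalation would honor the three stated premises: the server
    # stores nothing, uses the data only for this request, and its
    # guarantees are open to independent verification.
    return "private-cloud-compute"
```

The point of the sketch is simply that escalation is the exception, not the default: the server path is taken only when the device cannot handle the request locally.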