
The iPhone is about to get smarter, in part thanks to Google chips

Apple Intelligence just arrived in beta this week, and now the company has published an in-depth overview of how some of its AI features were created. A key tidbit? Two of Apple’s foundation models were created using Google-made chips.

The first in-depth look at Apple Intelligence’s development

Apple tends to shy away from sharing details about the inner workings of its product development. With AI and machine learning features, however, the company has long published its research for all to see.

The latest publication is titled ‘Apple Intelligence Foundation Language Models,’ and it’s one of the first papers since WWDC’s introduction of Apple Intelligence.

The document is written by researchers and for researchers, so it’s not the easiest to parse. However, a standout tidbit involves the chips used to train two of the Apple Intelligence language models.

In this report we will detail how two of these models—AFM-on-device (AFM stands for Apple Foundation Model), a 3 billion parameter language model, and AFM-server, a larger server-based language model—have been built and adapted to perform specialized tasks efficiently, accurately, and responsibly. These two foundation models are part of a larger family of generative models created by Apple to support users and developers; this includes a coding model (based on an AFM language model) to build intelligence into Xcode, as well as a diffusion model to help users express themselves visually, for example, in the Messages app.

Using Google chips to train Apple models

The two models mentioned, AFM-on-device and AFM-server, were not trained using Apple’s in-house Apple Silicon chips.

Instead, Apple turned to Google TPUs (Tensor Processing Units) to train its models. And according to the paper, it took a whole lot of TPUs to do the work.

  • AFM-on-device required 2,048 TPUv5p chips in its training
  • The larger AFM-server model required 8,192 TPUv4 chips

It is interesting that Apple went with Google's TPUs rather than the Nvidia GPUs that other companies tend to rely on.

The paper doesn't explain that choice, but perhaps future publications will.

What should we make of this news? Without further details from Apple, it’s hard to know what to think.

Knowing Apple's preference for doing as much work in-house as possible, it's very possible the company has already moved on from Google TPUs for model training and is using an advanced version of Apple Silicon.

In any case, what this does mean is that the iPhone getting a lot smarter with Apple Intelligence is, in part, thanks to Google.


You're reading 9to5Mac — experts who break news about Apple and its surrounding ecosystem, day after day.


Author

Ryan Christoffel

Ryan got his start in journalism as an Editor at MacStories, where he worked for four years covering Apple news, writing app reviews, and more. For two years he co-hosted the Adapt podcast on Relay FM, which focused entirely on the iPad. As a result, it should come as no surprise that his favorite Apple device is the iPad Pro.
