Apple Uses Google’s TPUs for AI, Sidestepping Nvidia | Several.com

Google TPUs Fuel Apple’s AI Models

Apple uses Google TPUs for new AI models

Published On: July 30, 2024

Apple's reliance on Google's Tensor Processing Units (TPUs) to train its artificial intelligence (AI) models has been detailed in a new research paper, highlighting a significant collaboration between two of the tech industry's giants. The paper reveals that Apple used Google's chips to develop two critical components of its AI software infrastructure, which will support a suite of forthcoming AI tools and features.

Apple's decision to use Google's TPUs instead of Nvidia's industry-leading graphics processing units (GPUs) is particularly noteworthy. Nvidia, which holds roughly an 80% market share in AI processors, is renowned for GPUs used widely across AI applications. Yet Apple's research paper makes no mention of Nvidia hardware, focusing solely on the Google-designed TPUs.

Apple's AI models were trained on two specific types of Google's TPUs. For the AI model that will run on consumer devices like iPhones, Apple deployed 2,048 TPUv5p chips. For its server-based AI model, 8,192 TPUv4 processors were used. This massive deployment underscores the scale and complexity of Apple's AI training efforts.

This collaboration highlights a strategic move by Apple to leverage Google's cloud infrastructure, which offers access to TPUs via the Google Cloud Platform. Unlike Nvidia, which sells its GPUs and systems as standalone products, Google makes its TPUs accessible only through its cloud platform, requiring customers to build and run their software within that ecosystem. This approach lets Apple tap into the computational resources of Google's TPUs, optimizing the training of its AI models for performance and efficiency.
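In practice, targeting Cloud TPUs means writing training code against a framework with a TPU backend, such as JAX or TensorFlow; the article does not say which stack Apple used, so the following is only an illustrative sketch. It enumerates the accelerators a JAX process can see: on a Cloud TPU VM this would list TPU devices, while on an ordinary machine JAX falls back to the host CPU, so the same code runs anywhere.

```python
import jax

# Enumerate the accelerators visible to this process. On a Cloud TPU VM
# these are TPU devices; elsewhere JAX falls back to the host CPU.
devices = jax.devices()
print(f"backend: {jax.default_backend()}, device count: {len(devices)}")
for d in devices:
    print(d.platform, d.id)
```

Part of the appeal of this model is portability: the same program can run unchanged on CPU, GPU, or TPU, with the backend selected by the environment rather than the code.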

Industry context

The decision to use Google's TPUs instead of Nvidia's GPUs may also reflect broader trends in the AI and tech industry. While Nvidia's GPUs are widely recognized for their performance on AI tasks, TPUs are designed specifically for machine learning workloads, offering advantages in processing speed and power efficiency for certain kinds of jobs. By opting for Google's TPUs, Apple may be targeting performance characteristics that align more closely with its AI development goals.

Apple's engineers indicated in the research paper that Google's chips could support even larger and more sophisticated AI models. This suggests that the models discussed are just the beginning, with future iterations likely to exploit the full capabilities of Google's TPUs for more advanced AI applications.

Competitive edge and innovation

Apple's move to use Google's TPUs may spur further innovation and competition in the AI hardware market. Companies like Amazon, with its AWS Inferentia and Trainium chips, and other cloud providers are also investing heavily in specialized AI processors. This competitive environment could drive rapid advances in AI hardware, ultimately benefiting the entire industry by providing more options and pushing costs down.
