Google said Tuesday that its Cloud TPU v5p, one of the few alternatives to Nvidia’s AI chips, is now available to developers. The chip was first announced in December, the same day Google unveiled its Gemini chatbot. The new TPU, or tensor processing unit, can train large language models almost three times faster than its predecessor, Google’s TPU v4, the company said. Large language models (LLMs) power AI chatbots like ChatGPT.

Google’s announcement marks another milestone in Big Tech’s AI arms race. Nvidia is the main supplier of the AI chips known as GPUs, or graphics processing units. Google parent Alphabet is one of Nvidia’s biggest customers, behind Microsoft and Facebook parent Meta, and Google, Microsoft, Amazon, and Meta are all developing their own AI chips.

But Nvidia is still important to Google. In the same blog post announcing its latest AI chip, Google mentioned Nvidia 20 times. Right under its detailing of the TPU v5p, the company said it’s updating its A3 supercomputer, which runs on Nvidia GPUs. And Google reminded users that it’s using Nvidia’s latest chip, the Blackwell, in its AI Hypercomputer. Google also announced an Arm-based central processing …