Google In Talks With Marvell To Build New AI Chips As Inference Costs Surge
Summary
Google is in talks with Marvell Technology to develop two new AI chips aimed at surging inference costs, expanding its custom silicon supply chain beyond existing partners Broadcom and MediaTek as inference workloads increasingly overtake training as the dominant compute expense for hyperscalers.
Key Points
- Google is in talks with Marvell Technology to develop two new AI chips — a memory processing unit and an inference-optimized TPU — adding a third design partner to its custom silicon supply chain alongside Broadcom and MediaTek.
- The discussions have not yet produced a signed contract, but they follow Broadcom securing a TPU agreement that runs through 2031, signaling that Google is diversifying its chip suppliers rather than replacing any single partner.
- The move reflects a broader industry shift in which AI inference — serving billions of daily user queries — is overtaking training as the dominant compute cost, making purpose-built inference silicon a critical competitive advantage for hyperscalers like Google.