Meta Is Ready to Rock Nvidia’s Boat With Its In-House AI Chip
March 11, 2025
The adage "your arbitrage is my opportunity" could sum up Meta's push to build an in-house chip for AI training. Reuters reports the company recently began a small deployment of the chips after a successful test production run with Taiwan's TSMC (sorry, Intel). Meta already uses its in-house chips for inference, that is, tailoring content to specific users after a model has been developed and trained, and it wants to use them for training models by 2026.
From the article:
The push to develop in-house chips is part of a long-term plan at Meta to bring down its mammoth infrastructure costs as the company places expensive bets on AI tools to drive growth.
Meta, which also owns Instagram and WhatsApp, has forecast total 2025 expenses of $114 billion to $119 billion, including up to $65 billion in capital expenditure largely driven by spending on AI infrastructure.
One of the sources said Meta’s new training chip is a dedicated accelerator, meaning it is designed to handle only AI-specific tasks. This can make it more power-efficient than the integrated graphics processing units (GPUs) generally used for AI workloads.
Even if consumer applications of generative AI, like chatbots, end up being an overhyped bubble, Meta can deploy the technology to improve content recommendations and ad targeting. The vast majority of Meta’s revenue comes from advertising, and even small improvements in targeting capabilities can produce billions of dollars in new revenue as advertisers see better results.
Despite some flops and lackluster results from the Reality Labs division, Meta has managed to build strong hardware teams over the years and has seen some success with its Ray-Ban AI glasses. However, executives have warned teams internally that their hardware efforts still have not had the world-changing impact they are hoping for, and Meta's VR headsets sell only in the low millions annually. CEO Mark Zuckerberg has long sought to build out Meta's own hardware platforms so the company can reduce its reliance on Apple and Google.
Major tech companies have paid billions of dollars to Nvidia since 2022 to stock up on its much sought-after GPUs, which have become the industry standard for AI processing. While the company has some competitors, like AMD, Nvidia has been lauded for offering not just the chips themselves but also the CUDA software toolkit for developing AI applications on them.
Late last year, Nvidia reported that nearly 50% of its revenue in one quarter came from just four companies. All of these companies have sought to build their own chips so they can cut out the middleman and drive down costs: Amazon has its Inferentia chips, while Google has been developing Tensor Processing Units (TPUs) for years. These companies can afford to wait many years for a return, but there is only so long that investors will tolerate heavy spending before they demand that Meta show it is paying off.
Nvidia's revenue concentration among a few customers that are building their own processors, along with the rise of efficient AI models like China's DeepSeek, has raised some concerns about whether Nvidia can keep up its growth forever. CEO Jensen Huang, however, has said he is optimistic that data center providers will spend $1 trillion on infrastructure over the next five years, which could see his company continue to grow into the 2030s. And, of course, most companies will not be able to develop chips the way Meta can.