Meta unveils new in-house chips to support AI workloads

March 11, 2026

Meta Platforms Inc (NASDAQ:META, XETRA:FB2A, SIX:FB) on Wednesday introduced four new in-house chips designed to support artificial intelligence workloads, part of the company’s broader effort to expand data center capacity and reduce reliance on third-party hardware.

The chips belong to Meta’s Meta Training and Inference Accelerator (MTIA) family, a line of custom silicon the company first revealed in 2023 and updated with a second generation in 2024.

The first of the newly announced processors, MTIA 300, was deployed several weeks ago.

According to Meta, the chip is designed to train smaller AI models that power ranking and recommendation systems across its platforms, including Facebook and Instagram. These systems help determine which content and advertisements users see in their feeds.

Meta also outlined plans for three additional chips, MTIA 400, MTIA 450 and MTIA 500, which are aimed at more advanced generative AI inference tasks. Those workloads include creating images or videos based on user prompts. The chips are not intended for training large-scale language models, the company said.

In a blog post describing its roadmap, Meta said recent and planned MTIA generations are intended to improve generative AI inference performance while also supporting ranking and recommendation training.

The company added that the architecture uses a modular, multi-chiplet design that is co-developed with its software stack, allowing performance improvements while maintaining compatibility across systems.

Shares of Meta edged down 0.6% to about $650 following the announcement.

  
