
Meta Rolls Out In-House AI Chips Weeks After Massive Nvidia and AMD Deals

Meta Platforms has unveiled four new in-house artificial intelligence chips as part of its Meta Training and Inference Accelerator (MTIA) program to expand AI data center capabilities. The company has already deployed the MTIA 300 chip for training smaller AI models that power recommendation systems, while upcoming chips MTIA 400, MTIA 450 and MTIA 500 will focus on generative AI inference tasks such as image and video generation. Manufactured by Taiwan Semiconductor Manufacturing Company, the chips are expected to roll out gradually through 2027 as Meta accelerates development of custom silicon to improve efficiency and reduce reliance on external GPU suppliers like Nvidia and AMD.


Key Highlights

  • Meta unveiled four in-house chips designed for artificial intelligence workloads.
  • The chips are part of the Meta Training and Inference Accelerator (MTIA) family.
  • The MTIA 300 chip has already been deployed.
  • MTIA 400, MTIA 450 and MTIA 500 are expected to roll out gradually through 2027.
  • The move comes weeks after Meta signed major GPU deals with Nvidia and AMD for AI infrastructure.

Meta Introduces New In-House AI Chips for Data Centers

Meta has revealed four custom-built chips designed to support artificial intelligence workloads as part of the company’s large-scale expansion of AI infrastructure and data centers. The chips belong to the Meta Training and Inference Accelerator (MTIA) family, which was first introduced publicly in 2023 and updated with a second-generation design in 2024.

The first of the newly announced chips, MTIA 300, was deployed several weeks ago. It is intended to train smaller AI models that power core ranking and recommendation systems used across Meta’s platforms, including content and advertising recommendations.

Upcoming Chips Target Generative AI Inference

Meta also announced three additional chips, MTIA 400, MTIA 450 and MTIA 500, which are designed to support more advanced inference tasks related to generative artificial intelligence.

These workloads include generating images or videos from user prompts and running AI services once models have been trained. According to Meta executives, the chips are not intended to train large language models but instead focus on inference tasks.

The company stated that MTIA 400 has completed its testing phase and is expected to be deployed in data centers soon, while MTIA 450 and MTIA 500 are scheduled to become operational by 2027.

Faster Development Cycle for AI Hardware

Meta plans to release a new chip roughly every six months as it rapidly expands computing capacity for artificial intelligence workloads. The faster development cycle reflects the pace of the industry’s AI infrastructure build-out.

The chips are expected to remain operational for more than five years, according to company executives involved in the program.

Reducing Dependence on External Chip Suppliers

By designing custom silicon, Meta aims to improve price-performance efficiency across its data centers while reducing reliance on external chip suppliers.

The strategy also provides greater diversity in the company’s semiconductor supply chain and helps protect it from price fluctuations in the broader chip market.

Although Meta is investing heavily in its own silicon, the company continues to purchase large volumes of GPUs from Nvidia and AMD to support the rapid growth of its artificial intelligence infrastructure.

Custom Chips Built by Taiwan Semiconductor

Meta’s in-house AI chips are manufactured by Taiwan Semiconductor Manufacturing Company (TSMC), which produces many of the world’s most advanced semiconductors.

The development effort involves a large engineering team working on chip design, data center integration and infrastructure optimization.

Part of a Massive AI Infrastructure Expansion

The announcement comes amid Meta’s broader push to expand artificial intelligence infrastructure globally. The company is building major data centers in multiple U.S. states, including Louisiana, Ohio and Indiana.

These facilities are intended to support the growing computing demands of AI systems used across Meta’s products and services.

AI-assisted: This article was created with AI assistance and may contain errors.