Meta has unveiled four proprietary AI accelerators, developed internally for its own use. The chips carry up to 512 GB of HBM each.
Meta is showing off four self-developed AI accelerators. These MTIA chips build on the MTIA 100 and MTIA 200 that Meta previously unveiled. Those chips were primarily suited to running ranking and recommendation models: the algorithms that, among other things, make users’ Instagram feeds optimally personalized (read: addictive and profitable for Meta).
More than R&R
Meta is now building on that foundation with four new chips. The MTIA 300 is the direct successor to the previous processors. This accelerator is likewise optimized to run and train those ranking & recommendation (R&R) models.
The MTIA 400 is a more powerful variant, developed for a broader range of applications. This chip can handle not only R&R models but also more general generative AI workloads. Following testing, Meta is currently rolling out the MTIA 400 in its own data centers.
Another step up the ladder is the MTIA 450, which is specifically optimized for generative AI rather than R&R. Meta intends to use it primarily for internal inference. This chip will arrive in data centers in 2027.
In that same year, Meta hopes to roll out the MTIA 500. This is the most powerful self-developed AI chip the company is currently communicating about. The processor is expected to be about 50 percent faster than the MTIA 450.
Memory hogs
The unveiling shows how Meta intends to support its AI development with proprietary hardware, alongside chips from Nvidia and others. The MTIA chips also make it clear where all that memory is going.
The MTIA 400, which is already being rolled out today and primarily serves to personalize Meta users’ feeds, contains 216 GB of HBM per chip. The top-tier MTIA 500 model has up to 512 GB of HBM on board.
With the unveiling of these chips, Meta wants to highlight its expertise in AI hardware. However, the details about the accelerators, and especially their HBM capacity, mainly explain why your next laptop will be so much more expensive. Meta speaks of scaling AI experiences for billions of people, but that is a euphemism. In practice, the MTIA chips are largely about AI and R&R designed to keep people glued to their feeds and serve them the right advertisements.
The social value of these HBM-devouring chips is therefore at least open to debate, while the impact on the prices of laptops and smartphones for end users is very real.
