Meta builds a 1700W superchip and custom MTIA chips while ditching Nvidia, AMD, Intel, and ARM for inference

  • Meta’s 1700W superchip delivers 30 PFLOPs and 512GB of HBM
  • MTIA 450 and 500 prioritize inference over pre-training workloads
  • Future MTIA generations will support GenAI inference and ranking workloads

Meta is advancing its AI infrastructure with a portfolio of custom MTIA chips designed specifically for inference workloads across its apps.

The company is developing a 1700W superchip capable of 30 PFLOPs and 512GB of HBM, integrated within the same MTIA infrastructure to handle inference tasks at scale.
