ARM is set to create its own datacentre processors to help Meta mitigate the cost and energy use of running AI workloads on GPUs. Meta regards AI as a vital component across its platforms and aims to deliver a highly intelligent, personalized AI assistant, Meta AI, to a billion users. Achieving this involves a transition to custom silicon chips tailored to its specific workloads. With the Meta Training and Inference Accelerator (MTIA), the company planned significant adoption in 2024 to improve efficiency, with the goal of replacing existing GPU servers over the next few years.
Meta sees AI as a strategic technology initiative that spans its platforms, including Facebook, Instagram and WhatsApp.
Meta began adopting MTIA in the first half of 2024 for core ranking and recommendations inference.