FAIR, or Facebook AI Research, has been publishing technology prototypes since late 2021, but the company had not focused on converting its well-regarded research into products.
As investor interest soars, that is changing. In February, Zuckerberg announced a new top-level generative AI team that he said would turbocharge the company’s work in the area. Chief Technology Officer Andrew Bosworth also said generative AI was the area where he and Zuckerberg were spending the most time.
Two people familiar with the new team said its work is in the early stages and focused on building a foundation model, the core program on which generative AI products can be built. Carvill said the company has been building out this technology for more than a year. Until last year, Meta largely ran AI workloads on its fleet of commodity CPUs, the workhorse chips that have filled data centers for decades but perform AI work poorly.
Meta also began using a custom chip it designed in-house for inference, the process in which algorithms trained on huge amounts of data make judgments about new inputs. By 2021, that two-pronged approach had proved slower and less efficient than one built around GPUs, which were also more flexible at running different types of models than Meta’s chip.
A key source of the trouble can be traced to Meta’s belated embrace of the graphics processing unit, or GPU, for AI work. GPUs are well suited to AI because they can perform many operations at once, reducing the time needed to churn through billions of pieces of data. Meta had planned to develop a new, more ambitious in-house chip capable of both training AI models and running them, but the project did not produce the expected results. Meta is now reorganizing its AI units, naming two new heads of engineering in the process.