Meta’s Pricey Bet on AI Comes With New Custom Chips, Coder Tools
2023-05-19 00:29

In Meta Platforms Inc.’s year of cost cutting and layoffs, there’s one area that’s seeing record spending: an update of the social media giant’s infrastructure to keep pace in the artificial intelligence arms race.

On Thursday, the Facebook owner unveiled a slew of new technologies, including a new chip developed in-house to help train AI faster, and a tool that helps coders get suggestions for how to build their products. The company is also revamping its data centers to make it easier to deploy AI technology.

“This work reflects long term efforts that will enable even more advances and better use of this technology across everything we do,” Chief Executive Officer Mark Zuckerberg said in an emailed statement.

The custom accelerator chip will help speed up the recommendation algorithm that powers what people see on Facebook and Instagram. A new data center design is being rolled out specifically for hardware that’s best for AI. Meta said it has also finished the second phase of building its AI supercomputer to train large language models, which are technologies similar to those that power ChatGPT.

Meta’s capital expenditures hit a record $31.4 billion last year, more than four-and-a-half times the amount in 2017. This year, which Zuckerberg has called Meta’s “year of efficiency,” analysts expect a repeat of 2022’s levels, with many of those dollars going toward improving and expanding AI infrastructure.

“There is a little bit of tension” with the efficiency mandate, “but it’s not in direct competition to be investing in AI and also investing in efficiency,” said Kim Hazelwood, director of AI research at Meta.

Some of the AI updates are obvious drivers of efficiency within Meta, which has eliminated thousands of employees in recent months.

CodeCompose is a new generative AI-based tool for developers that can auto-complete or suggest changes to code. So far, 5,200 coders are using it in-house, accepting 22% of the suggestions it makes for code completion, the company said.
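CodeCompose itself isn’t publicly available, but the acceptance figure above can be pictured with a small sketch: a hypothetical log of suggestions shown to developers and whether each was kept, from which a rate like the 22% Meta cites would be computed. The event format and numbers here are illustrative assumptions, not Meta’s tool or data.

```python
# Hypothetical sketch of how a code-completion tool's acceptance rate
# might be measured. The event log stands in for a generative model such
# as CodeCompose; none of this is a real Meta API.
from dataclasses import dataclass

@dataclass
class SuggestionEvent:
    prefix: str        # code the developer had typed so far
    suggestion: str    # completion proposed by the model
    accepted: bool     # did the developer keep it?

def acceptance_rate(events: list[SuggestionEvent]) -> float:
    """Share of suggestions developers accepted (Meta reports roughly 0.22)."""
    if not events:
        return 0.0
    return sum(e.accepted for e in events) / len(events)

# Toy log: three suggestions shown, one kept -> ~33% acceptance.
log = [
    SuggestionEvent("def add(a, b):", "    return a + b", True),
    SuggestionEvent("for item in items:", "    print(item)", False),
    SuggestionEvent("with open(path) as f:", "    data = f.read()", False),
]
print(f"acceptance rate: {acceptance_rate(log):.0%}")
```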

The company has been increasingly looking to AI to solve its biggest business problems. For advertisers that have been frustrated by privacy changes from Apple Inc. that made their digital ads harder to target, Meta plans to use AI to make some better guesses about user interests. To compete with TikTok, Facebook and Instagram are starting to show content from people users don’t follow — something that requires an algorithm to guess what they may be interested in.

Investors are going to be looking for direct proof of those improvements to justify the deep spending, Angelo Zino, an analyst at CFRA Research, said in an interview.

“It’s clearly going to take some time for some of this stuff to really play itself out,” Zino said of Meta’s increase in capex spending generally. “There’s going to be a lot of scrutiny, making sure that they can start seeing an acceleration in some of those returns on the revenue side.”

When AI models are queried, they spit out answers, called inferences, that require a specific type of processing. Meta decided to develop a new chip, called the Meta Training and Inference Accelerator (MTIA), to handle that work in-house, complementing its slew of graphics processing units from Nvidia.

Meta hopes its MTIA chips will help the company generate more accurate and interesting predictions of which organic and ad content users see, ideally leading to people spending more time on the apps and clicking on more advertisements.
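As a rough illustration of the workload the two preceding paragraphs describe, the sketch below runs a toy embedding-based ranking model in inference mode with PyTorch, the framework Meta created. The architecture and sizes are invented for the example; the point is only that serving recommendations means many forward passes with no gradient computation, which is the kind of processing an inference accelerator like MTIA is meant to speed up.

```python
# Toy illustration of recommendation inference: score candidate posts for a
# user with an embedding dot product, then rank them. The model and numbers
# are made up for the example; this is not Meta's recommender.
import torch
import torch.nn as nn

class TinyRecommender(nn.Module):
    def __init__(self, n_users: int, n_items: int, dim: int = 16):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, dim)
        self.item_emb = nn.Embedding(n_items, dim)

    def forward(self, user_id: torch.Tensor, item_ids: torch.Tensor) -> torch.Tensor:
        # Relevance score = dot product between user and item embeddings.
        u = self.user_emb(user_id)    # (1, dim)
        v = self.item_emb(item_ids)   # (n_candidates, dim)
        return (v @ u.T).squeeze(-1)  # (n_candidates,)

model = TinyRecommender(n_users=1_000, n_items=5_000).eval()

# Inference only: no gradients and no parameter updates, the "specific type
# of processing" that accelerators are asked to do at enormous scale.
with torch.no_grad():
    user = torch.tensor([42])
    candidates = torch.arange(5_000)          # every candidate post or ad
    scores = model(user, candidates)
    top10 = torch.topk(scores, k=10).indices  # items to show the user
    print(top10.tolist())
```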

The company also launched its first in-house-built application-specific integrated circuit, or ASIC, designed for processing videos and live streaming. Users on Facebook and Instagram already share more than 2 billion short videos a day, and the new processor can help those videos load faster and use less data on whatever device a person is watching them on.

“We’ve been able to optimize and balance and target our first chips for our recommender models,” said Alexis Bjorlin, vice president of hardware engineering. “We also have all the visibility on what the different needs are for generative-AI workloads or any different element that comes down the pipe.”

While the recommendation engines used on Meta’s social media apps represent its current use of AI technology, the key to its future generative-AI work is the company’s AI supercomputer, called the Research SuperCluster, which it will use to train large artificial intelligence programs, known as models.

On Thursday, the company said it had completed the second phase of the supercomputer’s build-out. The machine trains its large language model, called LLaMA, and will be a key part of its efforts to build the metaverse, the virtual reality platform for which the company renamed itself from Facebook.

Meta has long been committed to making some of its sophisticated tech available to the outside community. While much of the hardware in its stack isn’t open source, some of the work it powers will be. LLaMA is shared with researchers, as is an AI model trained on its supercomputer that can solve ten International Math Olympiad problems. CodeCompose was built on public research shared by Meta’s AI research team. And its new inference chip will help the company continue to support PyTorch, the open source AI framework that Meta created and later moved to the Linux Foundation to give it more independence.

Although Meta has been working on AI tools for years, Zuckerberg chose to frame his company’s future around a virtual reality vision that was even more nebulous. That pivot has faced sharp investor scrutiny, so the deep investment in AI infrastructure could help rebuild confidence in Zuckerberg’s overall strategy, said Scott Kessler, analyst at investment researcher Third Bridge.

“They don’t want to be an also-ran” when it comes to the industry-wide race to infuse AI into businesses, Kessler said. “A lot more people are going to kind of buy into that narrative now than, say, six and nine months ago.”