Meta is serious about the Metaverse, AI and AI Chips
Meta to deploy in-house custom chips this year to power AI drive
Hey Everyone,
Mark Zuckerberg getting grilled by Congress about the impact of his apps on kids was cringe-worthy, but Facebook, now called Meta, has interesting plans around A.I. chips.
While Meta can somehow afford to throw money at the Metaverse, pay a dividend and invest in huge amounts of A.I. chips, it can apparently also afford to build chips of its own. Meta will develop its own custom AI chip and deploy it in its own data centres to power its AI tools. All of this, including the development and deployment of these new AI tools, could cost as much as $30 billion each year.
Meta Artemis
The new chip, code-named "Artemis," will go into production this year and be used in Meta's data centers for "inference," a fancy term for running AI models.
This is roughly what Sam Altman is trying to do as well: build a network for A.I. chip production so that Nvidia's monopoly is no longer the bottleneck.
The goal is to reduce reliance on Nvidia chips and control the cost of AI workloads, reports Reuters. Indeed, Microsoft also has such plans, but can any of them execute like Nvidia or AMD?
As demand far outstrips supply, the H100 has become highly sought after and extremely expensive, making Nvidia a trillion-dollar company for the first time. Realistically, the way things are going, it will be a $2 trillion-plus firm soon.
Microsoft has reportedly accelerated its work on a chip codenamed Athena, a project to build its own AI silicon. Obviously Amazon and Google are going after this as well, but how do they catch up with a monopoly specialist like Nvidia? How does China, for that matter?
The implementation of Meta’s proprietary chip could lead to substantial savings, potentially reducing annual energy expenses by hundreds of millions of dollars and cutting chip purchasing costs by billions. But I didn’t see a timeline that makes sense given the current bottleneck.
As demand for generative AI services continues to grow, it’s evident that chips will be the next big battleground for AI supremacy. So will the race to AGI, Apple Vision Pro-like products and BCIs. Meta clearly has an eye on all of the above.
The same companies that lead in advertising and cloud need a moat around A.I. and AI chips as well to be safe from future disruption. As you perhaps heard, Meta CEO Mark Zuckerberg recently announced that he plans to have 340,000 Nvidia H100 GPUs in use by the end of the year, for a total of approximately 600,000 GPUs running and training AI systems. This makes Meta Nvidia's largest publicly known customer after Microsoft. In compute, if not talent, Meta will at least be competitive.
The Artemis processor could save Meta hundreds of millions of dollars.