Hey Everyone,
Elon Musk's xAI did the expected: Grok was initially released as a proprietary or "closed source" model back in November 2023 and, until now, was accessible only through a paid subscription to Musk's separate but related social network X, a costly subscription to say the least.
Breaking News: xAI has just released Grok-1, a 314B MoE model!
So did Open-Sora 1.0.
Start exploring Grok-1 now:
Let's try to wrap our heads around this.
Elon Musk's xAI has open-sourced the base code of the Grok AI model, but without any training code. The company described it on GitHub as a "314 billion parameter Mixture-of-Experts model."
314B parameters, Mixture-of-Experts (2 of 8 experts active per token). Even the active parameter count alone (86B) exceeds the largest Llama model (Llama 2's 70B).
They say that Grok-1 is a 314 billion parameter Mixture-of-Experts model trained from scratch by xAI. It's not clear what they mean by "trained from scratch"; all of these models are trained on available data.
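The reported numbers (314B total, 86B active, 2 of 8 experts routed per token) are enough for a back-of-envelope estimate of the per-expert size. This sketch assumes a simplified split into one shared block plus eight identical experts, which is only an approximation of how real MoE layers interleave with dense layers:

```python
# Back-of-envelope sketch: per-expert size implied by Grok-1's reported figures.
# Simplifying assumption: params = shared (attention, embeddings) + n identical experts.

TOTAL_PARAMS = 314e9   # reported total parameter count
ACTIVE_PARAMS = 86e9   # reported active parameters per token
N_EXPERTS = 8          # experts per MoE layer
K_ACTIVE = 2           # experts routed per token

# total  = shared + N_EXPERTS * per_expert
# active = shared + K_ACTIVE  * per_expert
per_expert = (TOTAL_PARAMS - ACTIVE_PARAMS) / (N_EXPERTS - K_ACTIVE)
shared = TOTAL_PARAMS - N_EXPERTS * per_expert

print(f"per expert: ~{per_expert / 1e9:.0f}B, shared: ~{shared / 1e9:.0f}B")
# -> per expert: ~38B, shared: ~10B
```

Under these assumptions each expert holds roughly 38B parameters, with about 10B shared, which is consistent with 10 + 2 × 38 = 86B active per token.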
Is it Really Open-Source?
Model and training details were not released: in a blog post, xAI said the model wasn't tuned for any particular application, such as conversation. The company noted that Grok-1 was trained on a "custom" stack without specifying details.
VentureBeat notes that while the model's Apache 2.0 license means it can be freely used (with some minor conditions), it won't have live access to X content by default.
Already by the end of 2023, vastly more open-source models were being released than closed ones. (Infographic dated November 2023.)
Subscribe to Artificial Intelligence Survey to keep reading this post and get 7 days of free access to the full post archives.