Hey Everyone,
Elon Musk’s xAI did the expected: Grok was initially released as a proprietary, closed-source model back in November 2023, and until now it was accessible only through a paid subscription to Musk’s separate but related social network X, a costly one to say the least.
🚀 Breaking News: xAI has just released Grok-1, a 314B MoE model!
Open-Sora 1.0 was open-sourced around the same time.
Let’s try to wrap our heads around this.
Elon Musk’s xAI has open-sourced the base weights and architecture of its Grok AI model, but without any training code. On GitHub, the company describes it as a “314 billion parameter Mixture-of-Experts model.”
It has 314B parameters in a Mixture-of-Experts architecture, with 2 of 8 experts active per token. Even the active parameter count alone (roughly 86B) is larger than the biggest Llama 2 model (70B).
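Why is the active count 86B rather than a simple 2/8 of 314B? In a MoE transformer, only the expert feed-forward weights are sparsely routed; attention and embedding weights run for every token. A minimal back-of-the-envelope sketch (the per-expert and shared splits below are derived from the reported totals, not xAI’s published breakdown):

```python
# Illustrative MoE active-parameter arithmetic for Grok-1.
# Only TOTAL and ACTIVE come from the announcement; the per-expert
# and shared splits are inferred, not official figures.
TOTAL = 314e9        # total parameters (reported)
ACTIVE = 86e9        # parameters active per token (reported)
NUM_EXPERTS = 8
ACTIVE_EXPERTS = 2   # experts routed per token

# Two equations, two unknowns:
#   shared + ACTIVE_EXPERTS * per_expert = ACTIVE
#   shared + NUM_EXPERTS   * per_expert = TOTAL
per_expert = (TOTAL - ACTIVE) / (NUM_EXPERTS - ACTIVE_EXPERTS)
shared = TOTAL - NUM_EXPERTS * per_expert

print(f"per-expert ≈ {per_expert / 1e9:.0f}B parameters")
print(f"shared (always-on) ≈ {shared / 1e9:.0f}B parameters")
print(f"active per token ≈ {(shared + ACTIVE_EXPERTS * per_expert) / 1e9:.0f}B")
```

Under these assumptions each expert holds about 38B parameters and roughly 10B are always-on, which is why the active fraction (~27%) is a bit above the naive 2/8 = 25%.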
They say that Grok-1 is a 314 billion parameter Mixture-of-Experts model trained from scratch by xAI. It’s not entirely clear what “from scratch” means here — presumably training from randomly initialized weights rather than fine-tuning an existing checkpoint — and, like all of these models, it was still trained on publicly available data.
Is it Really Open-Source?
Model and training details were not fully disclosed: in a blog post, xAI said that the model wasn’t fine-tuned for any particular application, such as conversation. The company noted that Grok-1 was trained on a “custom” training stack without specifying details.
VentureBeat notes that while the model’s Apache 2.0 license means it can be freely used (with some minor conditions), it won’t have live access to X content by default.
Already by the end of 2023, vastly more open-source models were being released than closed ones. (Infographic dated November 2023.)