Musk’s xAI Unveils Grok-1: The Largest Open-Source Language Model Yet

March 18, 2024 – Musk’s AI venture, xAI, has officially announced that its large language model, Grok-1, is now open-source and available for public download.

According to the announcement, Grok-1 is a large language model built on a Mixture-of-Experts (MoE) architecture with 314 billion parameters, far exceeding the 175 billion parameters of OpenAI’s GPT-3. This makes it the largest open-source large language model by parameter count to date; its model weights and architecture are released under the Apache 2.0 license.
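The key idea behind a Mixture-of-Experts model is that a lightweight "router" sends each token to only a few expert sub-networks, so the model can carry a very large total parameter count while activating only a fraction of it per token. The sketch below is a toy NumPy illustration of that routing scheme under generic assumptions (the layer sizes, top-2 routing, and MLP experts here are illustrative, not Grok-1's actual architecture or published code):

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class MoELayer:
    """Toy Mixture-of-Experts layer: a router scores every expert per
    token, and only the top-k experts actually run for that token."""

    def __init__(self, d_model, d_hidden, n_experts, top_k=2):
        self.top_k = top_k
        # Router: one score per expert for each token.
        self.w_router = rng.standard_normal((d_model, n_experts)) * 0.02
        # Each expert is a small two-layer ReLU MLP.
        self.experts = [
            (rng.standard_normal((d_model, d_hidden)) * 0.02,
             rng.standard_normal((d_hidden, d_model)) * 0.02)
            for _ in range(n_experts)
        ]

    def forward(self, x):  # x: (n_tokens, d_model)
        scores = softmax(x @ self.w_router)          # (n_tokens, n_experts)
        top = np.argsort(-scores, axis=-1)[:, :self.top_k]
        out = np.zeros_like(x)
        for t in range(x.shape[0]):
            # Renormalize the selected experts' gate weights to sum to 1,
            # then mix their outputs; unselected experts never execute.
            gate = scores[t, top[t]]
            gate = gate / gate.sum()
            for k, e in enumerate(top[t]):
                w1, w2 = self.experts[e]
                out[t] += gate[k] * (np.maximum(x[t] @ w1, 0.0) @ w2)
        return out

layer = MoELayer(d_model=16, d_hidden=32, n_experts=8, top_k=2)
tokens = rng.standard_normal((4, 16))
y = layer.forward(tokens)
print(y.shape)  # (4, 16)
```

With 8 experts and top-2 routing, each token touches only a quarter of the expert parameters per forward pass, which is how MoE models decouple total parameter count from per-token compute.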

xAI stated that Grok-1 has been trained entirely in-house and completed its pre-training phase in October 2023. The released version is the original base model checkpoint from the end of the pre-training phase, meaning the model has not undergone any fine-tuning for specific applications.

One of xAI’s goals is to compete with giants like OpenAI, Google, and Microsoft in the large model space. Its team comprises talent from various renowned organizations such as OpenAI, Google DeepMind, Google Research, and Microsoft Research.

xAI has not yet disclosed benchmark results for the open-sourced Grok-1, so how its performance stacks up against other large models remains a topic of keen interest in the industry.
