July 02, 2024 – Elon Musk, the renowned tech entrepreneur, has recently announced that his artificial intelligence startup, xAI, is set to unveil its large language model, Grok-2, in August. This new model promises to usher in more advanced AI capabilities, marking a significant milestone in the field of artificial intelligence.
While the highly anticipated Grok-2 is still under wraps, Musk has already begun building buzz for its successor, Grok-3. In a recent statement, Musk emphasized the importance of datasets in training AI chatbots and acknowledged the considerable effort required to cleanse large language models (LLMs) of existing data. He also touched on several issues related to training on the outputs of OpenAI models.
In a revealing disclosure, Musk mentioned that Grok-3 is being trained on a staggering 100,000 NVIDIA H100 chips, hardware designed for processing LLM workloads. The release of Grok-3 is slated for the end of the year, and Musk believes it will be “very special.”
The NVIDIA H100, an AI chip tailored for handling LLMs, carries an estimated price tag of between $30,000 and $40,000 per unit. However, bulk purchases may qualify for discounts. A simple calculation reveals that the 100,000 H100 chips utilized by xAI are valued at approximately $3 billion to $4 billion.
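The back-of-the-envelope arithmetic behind that estimate is straightforward; the short Python sketch below reproduces it using only the figures cited above (100,000 units at $30,000 to $40,000 each) and, as a simplifying assumption, ignores any bulk discount.

```python
# Back-of-the-envelope estimate of xAI's H100 hardware cost, using the
# figures cited in the article. Bulk discounts are ignored (assumption).
NUM_H100_UNITS = 100_000
UNIT_PRICE_LOW_USD = 30_000   # estimated low-end price per H100
UNIT_PRICE_HIGH_USD = 40_000  # estimated high-end price per H100

low_total = NUM_H100_UNITS * UNIT_PRICE_LOW_USD    # $3.0 billion
high_total = NUM_H100_UNITS * UNIT_PRICE_HIGH_USD  # $4.0 billion

print(f"Estimated H100 spend: ${low_total / 1e9:.1f}B to ${high_total / 1e9:.1f}B")
# Output: Estimated H100 spend: $3.0B to $4.0B
```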
Interestingly, Musk had previously mentioned that Tesla’s estimated purchases from NVIDIA this year would amount to $3 billion to $4 billion. This leads to the reasonable speculation that xAI could be utilizing the NVIDIA chips purchased by Tesla for the training of Grok-3.
Industry experts and enthusiasts are eagerly awaiting the launch of both Grok-2 and Grok-3, as these models are expected to push the boundaries of what’s possible with artificial intelligence. Musk’s ambitious vision and investment in AI technology continue to shape the future of this rapidly evolving field.