October 09, 2023 – Russia has set an ambitious goal: building up to 10 supercomputers by 2030, each housing 10,000 to 15,000 NVIDIA H100 GPUs, giving the country performance roughly on par with the systems used to train large language models such as ChatGPT.
According to foreign media outlets such as Tom’s Hardware, the project, led by Russia’s “Trusted Infrastructure Team,” aims to redefine the country’s computational capabilities and “break through the limits of computing.” Its estimated budget is around $6 billion USD (approximately 43.86 billion CNY at the current exchange rate). However, because hardware is advancing rapidly, the required outlay is expected to fall in the coming years: by 2030, such systems may cost $500–700 million USD (approximately 3.655–5.117 billion CNY at the current exchange rate).
Currently, Russia’s most powerful supercomputer is Yandex’s Chervonenkis, which ranks 27th in the world by computational power. Russia has seven systems on the global TOP500 list in total, three of which belong to Yandex.
For comparison, the United States has 150 supercomputers on the list, China has 134, Germany has 36, and Japan has 33.
According to earlier reports, Russian tech giant Yandex said in an interview last month that its YandexGPT has promising prospects compared with ChatGPT, the American competitor developed by OpenAI.
Dmitry Masyuk, head of Yandex’s Search and Advertising Technology Business Unit, said that YandexGPT’s Russian-language output has been steadily surpassing ChatGPT 3.5 and, in many cases, produces answers superior to ChatGPT 4.0. He added that YandexGPT’s overtaking its American counterpart is “only a matter of time.”