With MWC24 and its wall-to-wall AI hype done and dusted, Nvidia is making the most of its market-leading position with its own in-person event, GTC (GPU Technology Conference) 2024. The company announced a wide range of new products, partnerships and initiatives, including its next moves in communications, new graphics processing units (GPUs) and strengthened relationships with the major cloud platforms, currently the vendor's largest customers.
On the first day of the event, held in San Jose, California, and streamed online, Nvidia made 40 separate announcements. The most talked-about was the launch of its next-generation AI GPU, the B200 Blackwell, which the company says can perform some calculations and operations 30 times faster than its predecessor.
The move is clearly aimed at cementing Nvidia's position as the leading provider of the processors that enable AI training and inference. The company has an estimated 80% share of the AI GPU market, a leadership position that has driven significant revenue growth and pushed its market valuation to more than $2.2tn, making it the third most valuable company in the US behind only Microsoft and Apple. And that position is likely to strengthen further, as tech giants such as Amazon, Google, Microsoft and OpenAI have decided it makes more sense to buy Nvidia's chips and forge deep partnerships with the company than to spend the money needed to develop their own AI silicon (more on those relationships later), at least until competition from AMD, Intel and others starts to erode Nvidia's position.
Blackwell is named after David Blackwell, the American statistician and mathematician whose pioneering work on dynamic programming is still widely used in the global financial industry and many sciences today, and who also made major contributions to game theory, probability theory and information theory. Nvidia's new chip is the successor to its highly influential and hugely profitable H100 Hopper series, which has driven AI advances in recent years. Introducing the new product at GTC 2024, Nvidia founder and CEO Jensen Huang said Blackwell is "twice as fast" as Hopper but, very importantly, is also equipped with in-network computing capabilities for even faster operation. Because it is so fast, Huang claimed, it will be able to do remarkable things, such as "transforming audio into 3D video".
All this and more is made possible by the Blackwell chip's almost unbelievable 208 billion transistors, 128 billion more than Hopper: the mind boggles. Huang added that the new chip delivers five times the AI performance of Hopper while consuming 25 times less energy.
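Taken at face value, Nvidia's headline ratios imply a very large jump in performance per unit of energy. The quick back-of-the-envelope check below uses only the figures quoted in the keynote (marketing claims, not independent measurements):

```python
# Back-of-the-envelope check of Nvidia's claimed Blackwell-vs-Hopper ratios.
# All inputs are the vendor's own marketing figures, not measured data.
HOPPER_TRANSISTORS = 80e9      # H100 Hopper: roughly 80 billion transistors
BLACKWELL_TRANSISTORS = 208e9  # B200 Blackwell: 208 billion transistors

perf_ratio = 5.0               # claimed: 5x the AI performance of Hopper
energy_ratio = 25.0            # claimed: 25x lower energy consumption

# The transistor delta quoted in the article: 128 billion more than Hopper
transistor_delta = BLACKWELL_TRANSISTORS - HOPPER_TRANSISTORS

# Five times the work for a twenty-fifth of the energy implies the
# performance-per-energy figure of merit improves by 5 * 25 = 125x
perf_per_energy = perf_ratio * energy_ratio

print(f"Extra transistors: {transistor_delta / 1e9:.0f} billion")  # 128 billion
print(f"Implied perf-per-energy gain: {perf_per_energy:.0f}x")     # 125x
```

If both claims held simultaneously for the same workload, the implied 125x gain in performance per joule would be the truly striking number, which is why such vendor ratios deserve independent benchmarking.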
He said it has "no memory locality issues, no cache issues, just a huge chip with huge power and utility", but that it will be "pretty expensive" to buy.
Well, the H100 Hopper chips cost more than $30,000 each and, while the B200 Blackwell's price has yet to be announced, it could be dizzyingly expensive. Nevertheless, such is the pace of AI development that Nvidia's customers will be lining up to get their hands on Blackwell regardless of the cost. The new chips are expected to be on the market by this summer.
Among the companies lining up to get them are the hyperscale cloud giants, and Nvidia further strengthened its relationships with those companies during the opening hours of GTC 2024, announcing extended partnerships with Amazon Web Services (AWS), Microsoft (including, but not limited to, its Azure cloud operations), Google Cloud and Oracle (including its cloud operations).
Nvidia also used GTC 2024 to clearly set out its intentions for the communications sector. The vendor has been developing relationships with radio access network (RAN) equipment vendors and network operators in recent months and, during MWC24, announced the formation of the AI-RAN Alliance, which will focus on potential next-generation architectures for mobile access networks. See AI-RAN Alliance launch at #MWC24.
Now it has introduced the 6G Research Cloud platform, a new set of software tools for applying AI to the RAN. It comprises three main elements: the Aerial Omniverse Digital Twin for 6G, "a reference application and developer sample that enables physically accurate simulation of complete 6G systems, from a single tower to city scale"; Aerial CUDA-Accelerated RAN, a "software-defined, full-RAN stack that provides researchers with significant flexibility to customise, program and test 6G networks in real time"; and the Sionna Neural Radio Framework, which "offers seamless integration with popular frameworks such as PyTorch and TensorFlow, and leverages Nvidia GPUs to generate and capture data and train AI and machine learning models at scale".
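To give a sense of what "physically accurate simulation" of a radio link means in practice, here is a deliberately tiny, standard-library-only Monte Carlo sketch of an uncoded BPSK link over an AWGN channel. It is an illustrative toy of the kind of link-level experiment that GPU-accelerated frameworks such as Sionna run at massive scale; the function name and parameters are this article's own, not part of any Nvidia API:

```python
import math
import random

def simulate_bpsk_ber(eb_n0_db: float, n_bits: int = 100_000, seed: int = 42) -> float:
    """Estimate the bit error rate (BER) of uncoded BPSK over an AWGN channel.

    A minimal stand-in for the link-level simulations that 6G research
    platforms accelerate on GPUs; here it is plain, slow, scalar Python.
    """
    rng = random.Random(seed)
    eb_n0 = 10 ** (eb_n0_db / 10)            # convert dB to a linear ratio
    noise_std = math.sqrt(1 / (2 * eb_n0))   # noise std dev, assuming Eb = 1
    errors = 0
    for _ in range(n_bits):
        bit = rng.randint(0, 1)
        symbol = 1.0 if bit else -1.0        # BPSK mapping: bit -> +/-1
        received = symbol + rng.gauss(0.0, noise_std)
        decided = 1 if received > 0 else 0   # hard-decision demapping
        errors += (decided != bit)
    return errors / n_bits

# Theory predicts BER = Q(sqrt(2 * Eb/N0)) ~ 1.25e-2 at 4 dB, so the
# Monte Carlo estimate should land in that neighbourhood.
ber = simulate_bpsk_ber(4.0)
print(f"Estimated BER at 4 dB Eb/N0: {ber:.4f}")
```

Scaling a loop like this from 100,000 bits to the billions needed for city-scale digital twins, and replacing the hard-decision demapper with a trainable neural one, is exactly the workload the new platform pitches at GPUs.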
The platform will enable organisations to accelerate the development of 6G technologies that will connect trillions of devices with cloud infrastructure, laying the foundation for a hyper-intelligent world supported by autonomous vehicles, smart spaces, and a wide range of extended reality and immersive education experiences. The vendor says initial 6G Research Cloud participants include Ansys, Arm, ETH Zurich, Fujitsu, Keysight, Nokia, Northeastern University, Rohde & Schwarz, Samsung, SoftBank and Viavi.
And, still in the communications sector, Singtel announced plans to launch a GPU-as-a-service (GPUaaS) offering in Singapore and South-east Asia in the third quarter of this year, "giving businesses access to Nvidia's AI computing power to increase efficiency, drive growth and accelerate innovation". Singtel first announced its partnership with Nvidia earlier this year. See Singtel enters into green AI and data center partnership.
Nvidia also announced a new suite of chips that can run chatbots in cars and trucks, and another family of GPUs for the creation of humanoid robots. Nvidia currently dominates the AI infrastructure space, but rivals such as AMD and Intel are not sitting idle: Intel has developed its Gaudi AI accelerator, AMD is developing its Instinct line, and startups such as Cerebras are also making waves.
Demand for AI chips is at an all-time high and is expected to keep growing this year and next, so companies with enough AI chips in stock and ready to ship will win, while those without readily available inventory will lose. The laws of the market apply even in the rarefied field of cutting-edge high tech.
– Martyn Warwick, TelecomTV Editor-in-Chief. With additional reporting by Ray Le Maistre, TelecomTV Editorial Director


