AMD launches MI350 AI chip line to rival Nvidia’s Blackwell processors, debuts AI cloud service

AMD Unveils Powerful New AI Chips and Cloud Platform at Advancing AI Event
At its Advancing AI event in San Jose, California, AMD officially launched its MI350 series of AI chips, a significant leap aimed at directly competing with Nvidia’s powerful Blackwell GPU line. The event also featured announcements about AMD’s next-generation MI400 GPUs, as well as the introduction of the AMD Developer Cloud, which gives developers direct cloud-based access to AMD’s AI chips.
MI350 Line Targets High-Performance AI Computing
The MI350 series includes the MI350X and MI355X chips, both designed for intense AI workloads such as machine learning model training and inferencing. According to AMD, these new chips offer:
- A 4x improvement in AI compute performance over the previous generation
- A 35x increase in inferencing performance over the previous generation
Each chip is equipped with 288GB of HBM3E memory, surpassing the 192GB found in Nvidia’s Blackwell chips. However, Nvidia’s GB200 superchip uses a dual-GPU setup, totaling 384GB, giving it a slight edge in memory capacity.
Flexible GPU Deployment Options
Users can configure MI350 chips in two major ways:
- Single-chip setups: Individual MI350X or MI355X units.
- Platform setups: AMD’s pre-configured platforms combine 8 GPUs per unit, offering up to 2.3TB of memory.
These platforms can be deployed in air-cooled systems (up to 64 GPUs) or liquid-cooled systems (up to 128 GPUs), allowing flexibility for data centers with different cooling infrastructures.
AMD MI400 Series: The Future of AI Chips
Looking beyond the MI350, AMD previewed its MI400 GPU series, which is expected to launch in 2026. These chips are designed to compete with Nvidia’s upcoming GB300 Blackwell Ultra and Rubin AI GPUs.
Key specifications for MI400 GPUs include:
- Up to 432GB of HBM4 memory
- Memory bandwidth up to 19.6TB per second
This next-gen series signals AMD’s continued focus on pushing boundaries in memory speed and capacity, which are critical for handling the growing demands of generative AI and large language models.
AMD Developer Cloud: AI Hardware as a Service
Another major announcement was the launch of AMD Developer Cloud, a new cloud platform designed to democratize access to AMD’s AI processing capabilities.
Developers can now:
- Log in and access MI300 and MI350 chips via the cloud
- Perform AI training and inferencing tasks without purchasing costly hardware
- Speed up development by leveraging high-performance infrastructure on-demand
This move puts AMD in a more direct competitive position with cloud services that already host Nvidia’s GPUs and adds another layer of accessibility for startups, researchers, and developers.
Market Context: AMD Faces Financial and Geopolitical Headwinds
Despite its technological strides, AMD’s stock has not performed as well as Nvidia’s. Over the past 12 months:
- AMD shares are down ~24%
- Nvidia shares have gained over 19%
So far in 2025, AMD stock is down slightly, by 0.2%, while Nvidia's is up 7%.
Both companies have been hit by U.S. export restrictions on AI chips to China. AMD has forecast an $800 million hit from the export ban, while Nvidia reported a $4.5 billion write-down and expects to forgo an additional $8 billion in sales in its current fiscal quarter.
AMD’s Strategic Positioning Against Nvidia
With the launch of the MI350 and the preview of the MI400, AMD is clearly aiming to stake a more aggressive position in the AI hardware race. The combination of performance upgrades, higher memory specs, and the accessibility provided by AMD Developer Cloud reflects the company’s intent to gain more market share in a field currently dominated by Nvidia.
Though AMD still faces challenges, particularly from government regulation and investor sentiment, its latest announcements signal renewed momentum in the evolving AI landscape.