AMD sees AI chip market topping $500 bn by 2028, unveils MI350 GPU series

AMD forecasts that the AI chip market will exceed $500 billion by 2028 and unveils the MI350 GPU series as a direct challenge to Nvidia's dominance, signaling fierce competition and rapid growth in AI hardware.

As the world hurtles deeper into the AI revolution, the competition to dominate the hardware that powers artificial intelligence is fiercer than ever. On June 13, 2025, Advanced Micro Devices (AMD), a Silicon Valley powerhouse, made a bold declaration that’s shaking up the semiconductor landscape: the AI chip market is poised to explode, surpassing a staggering $500 billion valuation by 2028. Alongside this optimistic forecast, AMD unveiled its latest GPU series—the MI350—intended to lead the charge in accelerating AI workloads for data centers worldwide. But what does this mean for the industry, and how does AMD’s new offering stack up in the fast-evolving AI hardware race? Let’s dive in.

The AI Chip Market: A $500 Billion Opportunity

AMD CEO Lisa Su put the spotlight on the AI chip market's enormous growth potential at the company's recent Advancing AI event. The company estimates that the market, valued at around $45 billion just a few years ago, will grow at an annualized rate exceeding 60%, surpassing $500 billion by 2028[3][4]. This projection underscores AI's expanding footprint, not only in tech giants' data centers but across sectors that apply AI to automation, natural language processing, generative AI, healthcare diagnostics, autonomous vehicles, and more.
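As a rough sanity check, the growth rate implied by those two endpoints can be worked out directly. The sketch below is a back-of-the-envelope calculation, assuming the roughly $45 billion baseline refers to about 2023 and therefore a five-year horizon to 2028; those assumptions come from the figures quoted above, not from AMD's own model.

```python
# Back-of-the-envelope check of the forecast cited above.
# Assumptions (not AMD's own model): the ~$45B baseline is roughly 2023,
# giving a five-year horizon to 2028.
base = 45e9      # ~$45 billion starting market size
target = 500e9   # >$500 billion forecast for 2028
years = 5        # assumed 2023 -> 2028 horizon

implied_cagr = (target / base) ** (1 / years) - 1
print(f"Implied annualized growth: {implied_cagr:.1%}")  # ~61.9%, consistent with "exceeding 60%"
```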

This explosive growth is fueled by the rapid adoption of large language models, foundation models, and generative AI applications, which demand ever-more powerful and efficient computing infrastructure. AMD’s forecast aligns with industry-wide sentiment that AI-specific silicon will be the next battleground for chipmakers, competing fiercely with Nvidia, Intel, and emerging players.

Enter the MI350 GPU Series: AMD’s Answer to AI’s Hunger for Power

AMD's newly announced MI350 GPU series represents a significant leap in its AI-focused silicon portfolio. Designed specifically for data centers and high-performance AI workloads, the MI350 GPUs leverage AMD's latest architectural advancements, promising enhanced performance, efficiency, and scalability for training and inference tasks.

Although AMD has not yet disclosed full specifications or benchmark comparisons, industry observers note that the MI350 aims to compete directly with Nvidia's flagship AI GPUs, such as the H100 and the newer Blackwell-generation parts arriving in 2025. The MI350's architecture reportedly integrates upgraded matrix cores (AMD's counterpart to Nvidia's tensor cores) optimized for the dense matrix operations at the heart of neural network training, while improving performance per watt.
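To make that workload concrete, the operation such matrix engines exist to accelerate is a dense, reduced-precision matrix multiplication. The snippet below is a minimal, vendor-agnostic PyTorch illustration of that kernel, not MI350-specific code; it assumes a PyTorch build with GPU support and falls back to the much slower CPU path otherwise.

```python
import torch

# Dense matmul in bfloat16: the core operation that the matrix engines
# on modern AI accelerators are designed to speed up.
# Generic illustration, not MI350-specific code.
device = "cuda" if torch.cuda.is_available() else "cpu"

a = torch.randn(4096, 4096, dtype=torch.bfloat16, device=device)
b = torch.randn(4096, 4096, dtype=torch.bfloat16, device=device)

c = a @ b  # dispatched to the GPU's matrix units when a GPU is present
print(c.shape, c.dtype)
```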

One particularly exciting aspect is AMD's focus on interoperability and open software ecosystems. With demand for flexibility growing, AMD is doubling down on its open-source ROCm software stack and on compatibility with popular AI frameworks such as PyTorch and TensorFlow. This approach may appeal to cloud providers and AI startups seeking alternatives to more proprietary solutions.
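In practice, that compatibility means framework-level code does not have to change between vendors: PyTorch's ROCm builds expose AMD GPUs through the same `torch.cuda` interface used for Nvidia hardware. The sketch below is a minimal illustration of that portability, assuming either a ROCm or a CUDA build of PyTorch is installed; it is not tied to any particular GPU model.

```python
import torch

# Device-agnostic PyTorch code: on ROCm builds, AMD GPUs are exposed
# through the same torch.cuda namespace used for Nvidia GPUs, so this
# runs unchanged on either vendor's hardware (or falls back to CPU).
device = "cuda" if torch.cuda.is_available() else "cpu"

# A version string on ROCm builds, None (or absent) on CUDA/CPU builds.
print("HIP runtime:", getattr(torch.version, "hip", None))

model = torch.nn.Linear(1024, 1024).to(device)
x = torch.randn(8, 1024, device=device)
print(model(x).shape)
```

The same pattern carries over to higher-level libraries built on PyTorch, which is part of why framework compatibility features so prominently in AMD's pitch.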

New Partnerships and Market Traction

AMD announced a new customer, xAI, Elon Musk’s AI startup, as part of its expanding AI client roster[1]. While details remain under wraps, this partnership signals AMD’s growing relevance in powering next-gen AI platforms. xAI’s focus on developing advanced AI models will require top-tier hardware, and AMD’s MI350 series is positioned to meet that need, potentially challenging Nvidia’s dominance in this segment.

Financially, Wall Street analysts maintain a cautiously optimistic outlook on AMD's AI ambitions. An average price target of around $130 per share and a consensus "Outperform" rating suggest investors are betting on AMD's AI strategy to underpin long-term growth[1]. Yet some caution remains, as AMD has not provided detailed revenue forecasts for its AI business, reflecting how nascent and highly competitive the market is[2].

AMD vs. The Competition: A Quick Comparison

| Feature | AMD MI350 GPU Series | Nvidia H100 & Successors | Intel Gaudi & Ponte Vecchio |
|---|---|---|---|
| Target market | AI training & inference in data centers | AI & HPC workloads in data centers | AI & HPC, with focus on hyperscalers |
| Architectural focus | Enhanced matrix cores, high performance per watt | Hopper architecture, optimized tensor cores | AI-specific accelerators, Xe architecture |
| Software ecosystem | ROCm, open standards, PyTorch/TensorFlow support | CUDA, cuDNN, broad AI software ecosystem | oneAPI, growing AI software support |
| Partnerships | xAI, expanding AI startups and cloud providers | Multiple cloud giants, AI labs | Cloud hyperscalers, HPC centers |
| Market position | Emerging challenger, aggressive AI focus | Market leader in AI GPUs | Niche player emphasizing open AI silicon |

This table illustrates the fierce competition in the AI chip arena. While Nvidia retains a strong lead thanks to early investments and broad adoption, AMD's aggressive push with the MI350 and its strategic partnerships position it as a formidable challenger. Intel, meanwhile, focuses on specific HPC and AI niches with its Gaudi and Ponte Vecchio chips.

Historical Context and AMD’s AI Journey

AMD’s foray into AI chips isn’t brand new but has ramped up significantly over the past few years. Traditionally known for CPUs and gaming GPUs, AMD recognized the seismic shift AI represents and pivoted by investing heavily in AI-optimized GPUs and accelerators. The MI100 and MI200 series laid the groundwork with solid performance in scientific computing and AI training.

The MI350 series is the culmination of this sustained effort, integrating lessons learned and emerging technologies to compete head-to-head with Nvidia’s entrenched market position. AMD’s approach also leverages its CPU-GPU synergy, particularly with its EPYC server processors, enabling optimized heterogeneous computing platforms for AI workloads.

Future Implications: What’s Next for AMD and the AI Chip Market?

Looking ahead, AMD’s $500 billion AI chip market forecast isn’t just a hopeful number—it’s a reflection of AI’s growing pervasiveness in everyday technology and enterprise solutions. As AI models scale to trillions of parameters and real-time inference becomes critical, demand for specialized hardware will keep soaring.

AMD will need to continue innovating in chip design, software ecosystems, and strategic partnerships to maintain momentum. The MI350 series’ reception in the market, performance benchmarks, and adoption by major cloud players will be critical markers of success.

Moreover, geopolitical factors and supply chain dynamics will play a crucial role. With semiconductor manufacturing concentrated in a few global hubs, AMD’s partnerships with foundries and investment in advanced process nodes will impact its ability to meet demand.

Real-World Applications and Broader Impact

The impact of AMD’s AI chips extends beyond tech giants and data centers. Industries like healthcare are leveraging AI for diagnostics and personalized medicine, requiring robust computational backbones. Finance relies on AI for risk modeling and fraud detection, while autonomous vehicles and robotics depend on real-time AI inference powered by such chips.

By providing competitive alternatives to Nvidia’s GPUs, AMD fosters a more diverse ecosystem that can spur innovation, reduce costs, and accelerate AI adoption across sectors.

Conclusion

AMD's announcement of the MI350 GPU series and its bullish forecast of a $500 billion AI chip market by 2028 mark a pivotal moment in the AI hardware race. For anyone who has tracked AI's evolution, it's clear this is more than numbers or product launches; it is a changing of the guard in technology infrastructure.

The stakes are high, with AMD challenging incumbents, forging new partnerships, and betting on AI’s unstoppable growth. Whether the MI350 can capture significant market share will depend on performance, ecosystem support, and strategic execution. But one thing’s certain: the AI chip wars are far from over, and AMD is playing to win.
