AMD MI300X Pricing (September 2025): Cheapest High‑Memory GPUs in the Cloud

September 16, 2025
7 min read
Current on‑demand MI300X prices (per GPU)
*Normalized to a single MI300X, even when only multi‑GPU nodes are offered.
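The normalization above is simple division: when a provider only sells multi-GPU nodes, split the hourly node price evenly across its GPUs. A minimal sketch, with illustrative numbers (the node price and size below are examples, not quotes from any provider):

```python
def per_gpu_price(node_price_per_hr: float, gpus_per_node: int) -> float:
    """Divide the hourly node price evenly across its GPUs."""
    return node_price_per_hr / gpus_per_node

# Example: an 8x MI300X node priced at $20.00/hr normalizes
# to $2.50 per GPU-hour.
print(round(per_gpu_price(20.00, 8), 2))  # 2.5
```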
Methodology
Why you can trust these numbers:
- On‑demand only – no reservations or prepaid contracts.
- Same silicon – every figure refers to AMD MI300X (192 GB) SKUs.
- Public price lists – pulled September 16, 2025 from each provider’s published pricing pages.
- U.S. regions, USD – prices in other regions vary by 5–20%.
Cost‑performance snapshot
Thunder Compute’s A100 still delivers ~2‑5× more runtime per dollar than most on‑demand MI300X options, making it the cost‑efficiency pick for prototyping and fine‑tuning.
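The "runtime per dollar" figure is just the inverse of the hourly rate. A quick sketch using the two on-demand rates cited in this article ($0.78/hr for the A100 80 GB, $1.85/hr for the cheapest MI300X):

```python
def runtime_per_dollar(price_per_hr: float) -> float:
    """GPU-hours one dollar buys at a given on-demand rate."""
    return 1.0 / price_per_hr

a100 = runtime_per_dollar(0.78)    # ~1.28 GPU-hours per dollar
mi300x = runtime_per_dollar(1.85)  # ~0.54 GPU-hours per dollar

# The A100 advantage lands at the low end of the ~2-5x range,
# against the *cheapest* MI300X; pricier MI300X SKUs widen the gap.
print(round(a100 / mi300x, 2))  # 2.37
```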
Why this matters for developers
- More memory, new math – MI300X packs 192 GB HBM3 and FP8 support, ideal for 70B+ parameter models.
- Price gap vs. H100 – Even the cheapest MI300X ($1.85/hr) costs ~25% less than the lowest H100 price we tracked in July 2025.
- A100 still wins ROI – Unless your model truly needs 192 GB or FP8, an A100 80 GB at $0.78/hr often delivers better value.
- Bookmark this page – we refresh the numbers quarterly so you don’t have to.
Takeaways
- Vultr and TensorWave have opened the MI300X price war below $2/hr.
- RunPod remains the easiest MI300X self‑service experience under $3/hr.
- Hyperscalers (Azure, Oracle) still charge 3‑4× more than niche clouds.
- Thunder Compute A100s remain the clear price‑performance leaders for most workflows.