NVIDIA H100 Pricing (September 2025): Cheapest On-Demand Cloud GPU Rates
Comparison of single-GPU H100 hourly costs on AWS, Azure, GCP, Lambda Labs, RunPod, Vast.ai, and other U.S. providers.
Published:
May 19, 2025
Last updated:
Sep 16, 2025

The table below compares current on-demand, hourly rental prices for a single NVIDIA H100 80GB GPU across major U.S. cloud providers. Prices are normalized per-GPU (even if only multi-GPU instances are offered) and reflect standard on-demand rates in U.S. regions (no spot, reserved, or non-US pricing).
Provider | SKU/Instance | On-Demand $/GPU-hr* | Notes |
---|---|---|---|
AWS | p5.48xlarge (8 × H100 80GB) | ~$7.57 | 8-GPU instance (p5.48xl) at ~$60.54/hr total. |
Azure | NC H100 v5 VM (1 × H100 80GB) | $6.98 | Single H100 GPU VM in East US region. |
Google Cloud | A3 High (a3-highgpu-1g, 1 × H100) | ~$11.06 | 1× H100 80GB in us-central (on-demand). |
Oracle Cloud | BM.GPU.H100.8 (8 × H100 80GB) | ~$10.00 | 8-GPU bare-metal node at $80.00/hr total. |
Lambda Labs | 8× NVIDIA H100 SXM (80GB) | $2.99 | 8-GPU Lambda Cloud instance on HGX system. |
CoreWeave | 8× H100 HGX (80GB, InfiniBand) | ~$6.16 | 8-GPU HPC node w/ InfiniBand, $49.24/hr total. |
Paperspace | H100 80GB (Dedicated instance) | $5.95 | On-demand price per H100; multi-GPU discounts available. |
RunPod | H100 80GB (PCIe) | $1.99 | Community cloud price (per GPU); Secure Cloud is $2.39/hr. |
Vast.ai | H100 80GB (various hosts) | ~$1.87 | Marketplace lowest current price per H100 GPU. |
*Normalized September 2025 cost per single H100 GPU, even when only multi-GPU instances are offered by the provider.
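Where a provider sells only multi-GPU nodes, the per-GPU figure is simply the node's hourly price divided by its GPU count. A minimal Python sketch of that normalization, using node prices quoted in the table above:

```python
# Node-level hourly prices from the comparison table above
# (provider, $/hr for the whole node, GPUs per node).
NODE_PRICES = {
    "AWS p5.48xlarge":      (60.54, 8),
    "Oracle BM.GPU.H100.8": (80.00, 8),
    "CoreWeave 8x H100 HGX": (49.24, 8),
}

def per_gpu_rate(node_price_hr: float, gpus: int) -> float:
    """Normalize a multi-GPU node's hourly price to a single-GPU rate."""
    return node_price_hr / gpus

for name, (price, gpus) in NODE_PRICES.items():
    print(f"{name}: ${per_gpu_rate(price, gpus):.2f}/GPU-hr")
```

Running this reproduces the normalized figures in the table (e.g. $60.54/8 ≈ $7.57 for AWS).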
Methodology (why you can trust these numbers)
On-demand only: No reserved-instance, commitment, or prepaid discounts.
Same class of silicon: All prices refer to NVIDIA H100 80GB GPUs. Thunder Compute’s A100 80GB rate is also shown to help developers evaluate cost-performance tradeoffs.
Public price lists: Every figure comes from the provider’s current pricing page (or public documentation) on the date above; where a provider sells only 8-GPU nodes we divide by eight to get a single-GPU equivalent.
USD in U.S. regions: Rates elsewhere can differ by 5–20%.
A100 cost-performance benchmark
Provider | SKU/Instance | On-Demand $/GPU-hr | Notes |
---|---|---|---|
Thunder Compute | A100 80GB** | $0.78 | Not an H100; shown for cost-performance comparison. |
**Thunder Compute’s A100 80GB rate is included for reference. While A100s are a generation older, they remain highly capable for many workloads and offer dramatically better cost-efficiency for prototyping, fine-tuning, and small-scale training.
Why this matters for developers
Provider | 2 hrs runtime | Effective cost |
---|---|---|
Thunder Compute – A100 80GB | 2 × $0.78 | $1.56 |
Vast.ai | 2 × $1.87 | $3.74 |
RunPod | 2 × $1.99 | $3.98 |
Lambda Labs | 2 × $2.99 | $5.98 |
Azure | 2 × $6.98 | $13.96 |
AWS | 2 × $7.57 | $15.14 |
Oracle | 2 × $10.00 | $20.00 |
Google Cloud | 2 × $11.06 | $22.12 |
Price sources: Thunder Compute pricing page, Lambda Labs “GPU Cloud” grid, RunPod pricing, Vast.ai marketplace listings, and the DataCrunch hyperscaler comparison for AWS, Google Cloud, and Azure.
Result: Two hours on Thunder Compute’s A100 costs less than 15 minutes on an AWS or GCP H100, and the A100 gives you roughly 9–14× more runtime per dollar than hyperscaler H100s.
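Another way to read the comparison is to fix a budget and see how many GPU-hours it buys from each provider. A quick sketch using the per-GPU rates above (the $10 budget is an arbitrary example, not a figure from the article):

```python
# Per-GPU hourly rates (USD) taken from the comparison tables above.
RATES = {
    "Thunder Compute (A100)": 0.78,
    "Vast.ai (H100)": 1.87,
    "RunPod (H100)": 1.99,
    "Lambda Labs (H100)": 2.99,
    "Azure (H100)": 6.98,
    "AWS (H100)": 7.57,
    "Oracle (H100)": 10.00,
    "Google Cloud (H100)": 11.06,
}

BUDGET = 10.0  # dollars; arbitrary example budget

# Cheapest first: show GPU-hours of runtime each provider sells for the budget.
for provider, rate in sorted(RATES.items(), key=lambda kv: kv[1]):
    print(f"{provider}: {BUDGET / rate:.1f} GPU-hours per ${BUDGET:.0f}")
```

Dividing the two extremes ($11.06 / $0.78 ≈ 14.2) is where the “up to ~14× more runtime per dollar” figure comes from.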
Takeaways
Thunder Compute’s A100 rate is roughly 9–14× cheaper than hyperscaler H100 rates on AWS, Azure, and Google Cloud, making it the clear price-performance leader in this comparison.
Specialized providers like Vast.ai, RunPod, and Lambda have narrowed the gap, but they still charge roughly 2–4× more than Thunder Compute for equivalent runtime.
Unless your workload truly needs H100 features (Transformer Engine, higher bandwidth, etc.), the A100 often delivers the best ROI for prototyping, fine-tuning, and small-scale training.
Bookmark this page—we refresh the numbers quarterly so you don’t have to.
Building a startup? See our analysis of startup-friendly GPU cloud providers for credit offers, then spin up an A100 or H100 on Thunder Compute.

Carl Peterson