AMD's MI300X is seeing increased adoption as enterprises seek alternatives to NVIDIA's supply-constrained GPUs, with its 192GB of HBM3 memory enabling larger model deployments per GPU.
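To put the headline memory figure in perspective, here is a rough sizing sketch. The `min_gpus_for_weights` helper and the 70B FP16 example are illustrative assumptions, not vendor guidance: 16-bit weights take about 2 bytes per parameter, so a 70B-parameter model needs roughly 140GB before any KV cache or activation memory is counted.

```python
import math

# Back-of-the-envelope sizing: minimum GPU count whose combined memory holds
# a model's raw weights. Hypothetical helper for illustration only; real
# deployments also need room for KV cache, activations, and runtime overhead.
def min_gpus_for_weights(params_billion: float, bytes_per_param: float, gpu_mem_gb: float) -> int:
    weight_gb = params_billion * bytes_per_param  # 1e9 params * N bytes == N GB per billion params
    return max(1, math.ceil(weight_gb / gpu_mem_gb))

# 70B parameters at FP16/BF16 (2 bytes per parameter) ~= 140GB of weights:
# fits on a single 192GB MI300X, but needs at least two 80GB H100s.
for gpu_name, mem_gb in [("MI300X (192GB)", 192), ("H100 (80GB)", 80)]:
    gpus = min_gpus_for_weights(params_billion=70, bytes_per_param=2, gpu_mem_gb=mem_gb)
    print(f"{gpu_name}: {gpus} GPU(s) just for 70B FP16 weights")
```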
Recent developments:

- 192GB of HBM3 is 2.4x the memory of the H100's 80GB (Dec 10, 2025)
- The ROCm software stack now supports major frameworks such as PyTorch (Dec 15, 2025); a minimal runtime check is sketched after this list
- Together AI and Lambda Labs added MI300X instances (Dec 18, 2025)
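On the framework-support point, PyTorch's ROCm builds expose AMD GPUs through the familiar torch.cuda API (backed by HIP), so a device query looks the same as it would on NVIDIA hardware. The `describe_accelerator` helper below is a hypothetical sketch, not part of any official API.

```python
import torch

# On a ROCm build of PyTorch, the torch.cuda API is backed by HIP, so the
# same device-query code runs on an MI300X as on an NVIDIA GPU.
# describe_accelerator is a hypothetical helper name, not a PyTorch API.
def describe_accelerator() -> str:
    if not torch.cuda.is_available():
        return "no GPU visible to PyTorch"
    backend = "ROCm/HIP" if torch.version.hip is not None else "CUDA"
    props = torch.cuda.get_device_properties(0)
    mem_gb = props.total_memory / 1e9
    return f"{torch.cuda.get_device_name(0)} via {backend}, {mem_gb:.0f}GB"

if __name__ == "__main__":
    # On an MI300X host with a ROCm-enabled PyTorch install, this should
    # report something like "AMD Instinct MI300X via ROCm/HIP, 192GB".
    print(describe_accelerator())
```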