News

For those who enjoy rooting for the underdog, the latest MLPerf benchmark results will disappoint: Nvidia’s GPUs have ...
AMD's Instinct MI325X accelerator was supposed to compete with ...
The Series A funds will be used, in part, to cover the cost of a new AI training cluster with 8,192 Instinct MI325X accelerators, a gussied-up MI300X with more HBM ... the jump Nvidia made from the ...
Founded in 2023, Las Vegas-based TensorWave provides companies with access to AI compute via AMD Instinct GPUs ... offers 1.8x more capacity and 1.3x more bandwidth than Nvidia's H200 GPU ...
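(The specific accelerator behind those multiples is elided in the snippet above. As a rough check under assumed publicly reported specs, if the comparison is the MI325X's 256 GB of HBM3e and ~6 TB/s of memory bandwidth against the H200's 141 GB and ~4.8 TB/s, then 256 / 141 ≈ 1.8 and 6.0 / 4.8 ≈ 1.25, consistent with the quoted 1.8x and roughly 1.3x figures; treat that mapping as an assumption, not something stated in the article.)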
AMD executives say the company's top priority is developing ROCm (Radeon Open Compute platform), and that it will fight back against NVIDIA ... on the AMD Instinct MI300X GPU ...
Launching in mid-2025, the MI350 is a massive leap forward: AMD is promising a 35× performance bump over the MI300X ... ROCm, AMD's answer to Nvidia's CUDA, doesn't directly bring in ...
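The "answer to CUDA" framing is concrete at the code level: ROCm's HIP runtime deliberately mirrors the CUDA API, so CUDA-style kernels map over almost line for line. Below is a minimal sketch, assuming a working ROCm install and the hipcc toolchain; the vector_add kernel is a hypothetical example, not code from any of the articles above.

```cpp
// Minimal HIP example: illustrative only, assumes ROCm/hipcc is installed.
#include <hip/hip_runtime.h>
#include <cstdio>
#include <vector>

// Same kernel qualifier and thread indexing as CUDA.
__global__ void vector_add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    std::vector<float> ha(n, 1.0f), hb(n, 2.0f), hc(n);

    // hipMalloc / hipMemcpy mirror cudaMalloc / cudaMemcpy.
    float *da, *db, *dc;
    hipMalloc((void**)&da, n * sizeof(float));
    hipMalloc((void**)&db, n * sizeof(float));
    hipMalloc((void**)&dc, n * sizeof(float));
    hipMemcpy(da, ha.data(), n * sizeof(float), hipMemcpyHostToDevice);
    hipMemcpy(db, hb.data(), n * sizeof(float), hipMemcpyHostToDevice);

    // Triple-chevron launch syntax, as in CUDA.
    const int threads = 256;
    vector_add<<<(n + threads - 1) / threads, threads>>>(da, db, dc, n);
    hipDeviceSynchronize();

    hipMemcpy(hc.data(), dc, n * sizeof(float), hipMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);  // expect 3.000000

    hipFree(da); hipFree(db); hipFree(dc);
    return 0;
}
```

HIP also has an Nvidia backend, so the same source can be compiled for CUDA hardware; that portability is the design argument behind positioning ROCm against CUDA.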
In Q1, AMD’s data center revenue surged 57% YoY to $3.7 billion. This growth was driven by CPU share gains and “strong growth of AMD Instinct ... MI300X’s 192GB HBM advantage (compared to ...
The company's fourth-generation EPYC CPUs serve as the foundation for its expanding data center presence, while its specialized portfolio, including the Instinct MI300X Series AI accelerators and ...
Nvidia wins MLPerf once again, by a mile. But AMD demonstrated it can compete with the older H200 in training smaller models.