News

A data center in Oklahoma City, Oklahoma, has been sold and looks likely to serve chip firm Cerebras. CoStar reports Scale ...
At over 2,500 t/s, Cerebras claims to have set a world record for LLM inference speed on the 400B parameter Llama 4 ...
Nvidia announced that 8 Blackwell GPUs in a DGX B200 could demonstrate 1,000 tokens per second (TPS) per user on Meta’s Llama ...
Cerebras Systems has officially announced Cerebras Inference, which the company bills as the world's fastest AI inference ...
Meta Platforms (NasdaqGS:META) has recently announced a significant collaboration with Red Hat, aimed at advancing generative AI technologies for enterprise applications. This partnership, along with ...
Growth has since slowed dramatically. AI is a different story, and chipmaker Cerebras provided an update of sorts this week. Cerebras filed to go public in September, but the process was slowed ...
Artificial intelligence chip startup Cerebras Systems Inc. is heralding the launch of Qwen3-32B, one of the most advanced and powerful open-weight large language models in the world, as proof of ...
Cerebras CEO Andrew Feldman said he hopes to take the company public in 2025 now that the chipmaker has obtained clearance from the U.S. government to sell shares to an entity in the United ...