News

Cerebras launched its AI inference service last August. Inference refers to the process of running live data through a trained AI model to make a prediction or solve a task, and high performance is ...
At over 2,500 t/s, Cerebras claims to have set a world record for LLM inference speed on the 400B parameter Llama 4 ...
A data center in Oklahoma City, Oklahoma, has been sold and looks likely to serve chip firm Cerebras. CoStar reports Scale ...
Meta partners with Cerebras to launch its new Llama API, offering developers AI inference speeds up to 18 times faster than traditional GPU solutions, challenging OpenAI and Google in the fast-growing ...
The market for serving up predictions from generative artificial intelligence, what's known as inference, is big business, with OpenAI reportedly on course to collect $3.4 billion in revenue this ...
Meta launches Llama 4 API with Groq and Cerebras as partners. Llama models offer lower costs and faster output than competitors. Buying Cerebras and Groq could help Meta fully own AI stack. At ...
AI chipmaker Cerebras is trying to be the first major venture-backed tech company to go public in the U.S. since April and to capitalize on investors' insatiable demand for Nvidia, now valued at ...
Cerebras makes a chip that is 56 times the size of a chip commonly used for artificial intelligence, with the potential to crunch data much faster. ...
NEW YORK, Oct 8 (Reuters) - Cerebras Systems is likely to postpone its IPO, after facing delays with a U.S. national security review on UAE-based tech conglomerate G42's minority investment in the ...
NEW YORK, March 25 (Reuters) - Cerebras Systems executives were hoping the Trump administration would wave through a national security review that has left the AI chipmaker's IPO in limbo for ...
Cerebras Systems Inc. said it has resolved “all open issues” with the Committee on Foreign Investment in the US, a crucial step that may clear the chip startup’s path for its much ...