The Register on MSN · 14h
DeepSeek-R1-beating perf in a 32B package? El Reg digs its claws into Alibaba's QwQ
How to tame its hypersensitive hyperparameters and get it running on your PC. Hands on: How much can reinforcement learning ...
While DeepSeek-R1 operates with 671 billion parameters, QwQ-32B achieves comparable performance with a much smaller footprint ...
A dense AI model with 32B parameters, excelling in coding, math, and local deployment. Compact, efficient, and powerful ...
After DeepSeek sparked a revolution in China's AI industry in early 2025, Alibaba's Tongyi Qianwen QwQ-32B is poised to become the next widely adopted large model, thanks to its parameters and ...
Alibaba (BABA) is stepping up its efforts in the AI race. The company has launched an upgraded version of its AI assistant ...
Alibaba’s QwQ-32B is a 32-billion-parameter AI designed for mathematical reasoning and coding. Unlike massive models, it ...
QwQ-32B, an AI model rivaling OpenAI and DeepSeek with 98% lower compute costs. A game-changer in AI efficiency, boosting Alibaba’s ...
Alibaba Cloud on Thursday launched QwQ-32B, a compact reasoning model built on its latest large language model (LLM), Qwen2 ...
Alibaba just unveiled its latest reasoning model, QwQ-32B. It's said to rival DeepSeek at a much lower cost.
Alibaba Cloud’s latest model rivals much larger competitors with just 32 billion parameters in what it views as a critical ...
Alibaba's Hong Kong-listed shares (HK:9988) jumped over 8% today after the company launched its new open-source AI (artificial ...