On a drizzly and windswept afternoon this summer, I visited the headquarters of Rokid, a startup developing smart glasses in Hangzhou, China. As I chatted with engineers, their words were swiftly translated from Mandarin to English, then transcribed onto a tiny translucent screen just above my right eye by one of the company's new prototype devices.
Rokid's high-tech spectacles use Qwen, an open-weight large language model developed by the Chinese ecommerce giant Alibaba.
Qwen, whose full name is 通义千问, or Tōngyì Qiānwèn, is not the best AI model around. OpenAI's GPT-5, Google's Gemini 3, and Anthropic's Claude generally score higher on benchmarks designed to gauge different dimensions of machine intelligence. Nor was Qwen the first truly cutting-edge open-weight model; that was Meta's Llama, released by the social media giant in 2023.
Yet Qwen and other Chinese models, from DeepSeek, Moonshot AI, Z.ai, and MiniMax, are increasingly popular because they are both very good and very easy to tinker with. According to Hugging Face, a company that provides access to AI models and code, downloads of open Chinese models on its platform surpassed downloads of US ones in July of this year. DeepSeek shook the world by releasing a cutting-edge large language model built with far less compute than its US rivals used, but OpenRouter, a platform that routes queries to different AI models, says Qwen has risen rapidly in popularity over the course of the year to become the second-most-popular open model in the world.
Qwen can do most things you'd want from an advanced AI model. For Rokid's users, this might include identifying products snapped by a built-in camera, getting directions from a map, drafting messages, searching the web, and so on. Since Qwen can easily be downloaded and modified, Rokid hosts its own version of the model, fine-tuned to suit its applications. It is also possible to run a tiny version of Qwen on smartphones or other devices in case the internet connection goes down.
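To make that cloud-plus-on-device idea concrete, here is a minimal, hypothetical sketch of the pattern described above: a request goes to a hosted, fine-tuned Qwen endpoint, and a small local model takes over if the network drops. The endpoint URL, model names, and local helper are placeholders, not Rokid's actual setup.

```python
# Hypothetical illustration of the hosted-model-with-offline-fallback pattern.
# None of the names below come from Rokid; they are placeholders.
import requests

HOSTED_URL = "https://example.com/v1/chat/completions"  # placeholder endpoint


def ask(prompt: str) -> str:
    """Send a prompt to a hosted, fine-tuned Qwen; fall back locally if offline."""
    payload = {
        "model": "qwen-finetuned",  # placeholder name for a hosted fine-tune
        "messages": [{"role": "user", "content": prompt}],
    }
    try:
        resp = requests.post(HOSTED_URL, json=payload, timeout=5)
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]
    except requests.RequestException:
        # Network unavailable: hand the prompt to a tiny on-device model instead.
        return ask_local(prompt)


def ask_local(prompt: str) -> str:
    # Placeholder for a call into a quantized Qwen checkpoint running on-device.
    raise NotImplementedError("wire up a local Qwen runtime here")
```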
Before going to China, I installed a small version of Qwen on my MacBook Air and used it to practice some basic Mandarin. For many applications, modestly sized open source models like Qwen are just about as good as the behemoths that live inside big data centers.
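For readers who want to try something similar, here is a rough sketch of running a small Qwen model locally with the Hugging Face transformers library. The exact checkpoint name is an assumption; any small instruction-tuned Qwen model on Hugging Face should work the same way.

```python
# A minimal sketch of chatting with a small Qwen model on a laptop.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-0.5B-Instruct"  # assumed small Qwen checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

messages = [
    {"role": "system", "content": "You are a patient Mandarin tutor."},
    {"role": "user", "content": "How do I say 'Where is the train station?' in Mandarin?"},
]

# Format the conversation with the model's chat template, then generate a reply.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
outputs = model.generate(inputs, max_new_tokens=128)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```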
The rise of Qwen and other Chinese open-weight models has coincided with stumbles for some well-known American AI models over the past 12 months. When Meta unveiled Llama 4 in April 2025, the model's performance was a disappointment, failing to reach the heights of popular leaderboards like LMArena. The slip left many developers looking for other open models to play with.
