You’ve likely heard a lot about artificial intelligence lately, and here’s a major development that could shape its future. Alibaba Group’s open-source AI model, Qwen, has powered a new software engineering agent, DeepSWE, to the top of a major global ranking. Developed by Agentica and San Francisco-based Together AI, DeepSWE achieved 59% accuracy on the SWE-Bench Verified test, outperforming all other open-weight models, including DeepSeek’s V3-0324.
This achievement is no small feat. DeepSWE is a software agent built on the Qwen3-32B large language model, part of Alibaba Cloud’s latest third-generation AI family. Agentic frameworks such as the one behind DeepSWE are software platforms that allow artificial intelligence agents to act more like humans: planning, collaborating, making decisions, and solving problems independently.
Why does this matter to you?
If you use apps or tools that handle tasks automatically—whether it’s customer support, content creation, or software development—this win by DeepSWE is something to watch. AI agents like DeepSWE aren’t just trained to chat. They can write code, fix bugs, and work through complex problems on platforms like GitHub. In essence, they can act like digital assistants for developers, helping reduce the heavy lifting of software engineering.
What makes DeepSWE different? It’s been trained using Agentica’s rLLM, a modular reinforcement learning system. The team behind it didn’t just keep the results to themselves—they’ve open-sourced everything. That means developers worldwide can access the dataset, code, training process, and even the evaluation logs. This transparency enables other teams to build, improve, and scale their own AI agents more quickly.
Training DeepSWE wasn’t a weekend project. It ran for six days on powerful machines equipped with Nvidia’s high-end H100 graphics processors. This intensive training taught the model to handle detailed, high-level software engineering tasks efficiently.
Alibaba’s rise in open-source AI
This success adds to Alibaba’s growing influence in the global open-source AI scene. The company, based in Hangzhou, started releasing the Qwen models to the public in August 2023. By April 2025, it had released more than 200 open-source Qwen models, which together had been downloaded over 300 million times and inspired 100,000 derivative models worldwide.
The Qwen3 series, released in April, can be deployed through platforms such as Ollama, LM Studio, SGLang, and vLLM. According to tests run by Alibaba, some Qwen3 models—such as Qwen3-235B and Qwen3-4B—have matched or even beaten the likes of OpenAI’s o1, Google’s Gemini, and DeepSeek’s R1 in tasks like coding support, text generation, and solving complex mathematical problems.
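For readers who want a sense of what running one of these models looks like in practice, here is a minimal sketch using vLLM’s Python API. The model id Qwen/Qwen3-4B, the prompt, and the sampling settings are illustrative assumptions for this example, not a setup recommended by Alibaba or Together AI.

```python
# Minimal sketch: running a small Qwen3 checkpoint locally with vLLM.
# Assumes vLLM is installed and a GPU is available; the model id below
# is an assumption for illustration, so swap in the checkpoint you use.
from vllm import LLM, SamplingParams

llm = LLM(model="Qwen/Qwen3-4B")  # weights download from Hugging Face on first run

params = SamplingParams(temperature=0.7, max_tokens=256)
prompts = ["Write a Python function that checks whether a string is a palindrome."]

# Generate completions and print the first candidate for each prompt.
for output in llm.generate(prompts, params):
    print(output.outputs[0].text)
```

The same checkpoint can also be pulled into desktop tools like Ollama or LM Studio for a no-code setup, which is part of why open-weight releases spread so quickly among developers.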
Last month, Alibaba chairman Joe Tsai and CEO Eddie Wu Yongming proudly stated in a letter to shareholders that Qwen is now the world’s largest open-source AI model family. They emphasised that this strategy is part of a broader push to promote global adoption of Chinese-developed AI systems.
Massive investment in AI’s future
Alibaba Cloud is not slowing down. On July 4, the company announced a fresh investment of more than US$60 million to accelerate AI innovation through its partner ecosystem before the end of its current financial year in March.
This follows CEO Wu’s February pledge to invest at least 380 billion yuan (US$53 billion) over the next three years in AI and cloud computing infrastructure. It’s set to be the largest computing project ever backed by a private company in China.
So, what does this mean for you? Whether you’re a tech enthusiast, developer, or someone interested in how AI can make life easier, Alibaba’s Qwen-based DeepSWE is a clear sign that open-source models are not just catching up—they’re leading the way.