Alibaba has introduced a wave of new Qwen3 models for critical tasks including coding, complex reasoning, and machine translation, underscoring the company’s ongoing commitment to supporting the open-source community and its ambition to push the boundaries of AI innovation.
Qwen3-Coder: Alibaba’s Most Advanced Agentic AI Coding Model
Agentic AI is reshaping software development by enabling more autonomous, efficient, and accessible programming workflows. As a foundational step toward true AI agents, agentic coding represents one of the most promising frontiers in artificial intelligence.
In this context, Alibaba has launched Qwen3-Coder, its most advanced agentic AI coding model to date. Engineered for high-performance software development, Qwen3-Coder excels at a wide range of tasks—from generating code and managing complex development pipelines to debugging across entire codebases.
Built upon a Mixture-of-Experts (MoE) architecture, the flagship variant Qwen3-Coder-480B-A35B-Instruct delivers competitive performance against leading state-of-the-art (SOTA) models across key benchmarks in agentic coding, browser interaction, and tool usage. It has 480 billion parameters in total, of which only 35 billion are activated per token.
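The sparse activation that lets a 480-billion-parameter model run only 35 billion parameters per token can be illustrated with a toy sketch. All sizes below are hypothetical, chosen only to show the mechanism: a router scores the experts for each token and only the top-k experts execute.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 16   # toy value; the real model uses far more experts
TOP_K = 2          # experts activated per token
D_MODEL = 8        # toy hidden size

# Per-expert feed-forward weights hold the bulk of total parameters.
experts = [rng.standard_normal((D_MODEL, D_MODEL)) for _ in range(NUM_EXPERTS)]
router = rng.standard_normal((D_MODEL, NUM_EXPERTS))

def moe_forward(token: np.ndarray) -> np.ndarray:
    """Route one token through only its top-k experts."""
    logits = token @ router
    top = np.argsort(logits)[-TOP_K:]                          # indices of the top-k experts
    weights = np.exp(logits[top]) / np.exp(logits[top]).sum()  # softmax over the selected experts
    # Only TOP_K of the NUM_EXPERTS expert matrices are touched for this
    # token, so active parameters are a small fraction of the total.
    return sum(w * (token @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(D_MODEL)
out = moe_forward(token)

total_params = NUM_EXPERTS * D_MODEL * D_MODEL
active_params = TOP_K * D_MODEL * D_MODEL
print(out.shape, active_params / total_params)  # (8,) 0.125
```

In this toy setup only 2 of 16 experts fire per token; Qwen3-Coder's 35B-of-480B ratio (roughly 7% of weights active per token) follows the same principle at much larger scale.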
Designed to tackle complex, real-world problems through multi-step interactions with external tools and environments, Qwen3-Coder features a context window of 256K tokens, extendable up to 1 million tokens. Its performance is further enhanced through innovative training techniques, including long-horizon reinforcement learning (agent RL) during post-training.
Qwen3-Coder-480B-A35B-Instruct achieves competitive results against leading state-of-the-art (SOTA) models across key benchmarks
Complementing the model, Alibaba is also open-sourcing Qwen Code, a powerful command-line interface (CLI) tool that allows developers to delegate engineering tasks to AI using natural language. With its open-source availability, strong agentic reasoning, and seamless compatibility with popular developer tools and interfaces, Qwen3-Coder is poised to become an essential asset for developers worldwide.
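Since Qwen models are typically served behind OpenAI-compatible chat endpoints, delegating an engineering task programmatically might look like the following sketch. The model identifier, endpoint, and system prompt are assumptions for illustration, not confirmed API details; check your provider's documentation for exact values.

```python
import json

def build_coding_request(task: str,
                         model: str = "qwen3-coder-480b-a35b-instruct") -> dict:
    """Assemble a chat-completion payload asking the model to write code.

    The model name is an assumed identifier; substitute the one your
    provider actually exposes.
    """
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are an expert coding assistant."},
            {"role": "user", "content": task},
        ],
    }

payload = build_coding_request(
    "Write a Python function that reverses a singly linked list."
)
print(json.dumps(payload, indent=2))

# With an OpenAI-compatible client, the request itself would then be roughly:
#   client = OpenAI(base_url="<your provider's compatible endpoint>", api_key="...")
#   resp = client.chat.completions.create(**payload)
```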
Qwen3-235B: Open-Source LLM Delivering Strong Performance
Alibaba also introduced Qwen3-235B-A22B-Thinking-2507, a thinking model with an extended thinking length, making it well suited for highly complex reasoning tasks.
Compared to the previous iteration, the model offers enhanced 256K long-context understanding and significantly improved performance on reasoning tasks, including logical reasoning, mathematics, science, coding, and academic benchmarks that typically require human expertise. It achieves state-of-the-art (SOTA) results among open-source thinking models and remains competitive with leading closed-source thinking models.
Qwen3-235B-A22B-Thinking-2507 achieves strong performance across benchmarks
The same model also showcased remarkable general capabilities in areas such as instruction following, tool usage, text generation, and alignment with human preferences. It is now available on Hugging Face and GitHub.
Earlier this week, a non-thinking variant, Qwen3-235B-A22B-Instruct-2507-FP8, was also released. It features substantial gains in long-tail knowledge coverage across multiple languages, as well as better alignment with user preferences in subjective and open-ended tasks.
Qwen-MT: Upgraded Machine Translation Model Supporting 92 Languages
Lastly, Alibaba has unveiled the latest iteration of its machine translation model: Qwen-MT. Built on the powerful Qwen3 foundation, this model is trained on trillions of multilingual and translation-specific tokens, significantly enhancing its cross-lingual understanding and translation quality.
By incorporating advanced reinforcement learning techniques, Qwen-MT achieves notable gains in translation accuracy, fluency, and contextual coherence.
Qwen-MT supports high-quality translation across 92 major official languages and prominent dialects, covering over 95% of the global population. By enabling customizable prompt engineering, it delivers optimized translation performance tailored to complex, domain-specific, and mission-critical application scenarios. It also uses a lightweight Mixture-of-Experts (MoE) architecture to deliver high translation throughput with faster response times and reduced API costs.
With the release of the latest Qwen3-235B, Qwen3-Coder and Qwen-MT, Alibaba reaffirms its leadership in open-source AI innovation. With over 140,000 derivative models created, Alibaba’s family of 300 Qwen models has become one of the most widely adopted open-source AI series globally. By making leading-edge models and tools freely available, the company is empowering developers, researchers, and enterprises to build the next generation of intelligent applications and navigate multilingual communication challenges faster, more efficiently, and more inclusively than ever before.