Singapore picks Alibaba’s Qwen to drive regional language model in big win for China tech

AI Singapore (AISG) – a national programme by the city state of Singapore to accelerate the adoption of artificial intelligence – has chosen to base its latest large language model on Alibaba’s Qwen, in a significant win for the Chinese technology giant as it promotes its AI services in Southeast Asia.

AI Singapore, which works to strengthen the city state's national AI capabilities, has released a new model, Qwen-SEA-LION-v4, based on Alibaba's Qwen3-32B foundation model to better address the linguistic and cultural demands of the region, Alibaba Cloud said in a statement. Alibaba Group Holding owns the South China Morning Post.

Earlier versions of the SEA-LION models were based on Llama, the open-source large language model developed by US tech giant Meta.

AISG in August also released a multimodal model for Southeast Asia based on Gemma, an open-source model developed by Google DeepMind.

Competition between Chinese open-source models and US models in third-party countries like Singapore is closely watched, as it has broad implications for the world’s AI landscape.

The roll-out of models such as DeepSeek and Qwen has significantly boosted China's competitiveness in this arena.

The cooperation between Alibaba Cloud and AISG underscores how Alibaba’s open-source strategy has helped the company’s AI model family maintain its appeal among global AI developers.

Alibaba's Qwen series, first open-sourced in August 2023, has become one of the world's largest open-source model families. The series has logged more than 600 million downloads, with developers worldwide creating more than 170,000 derivative models, the company said in September.


The Qwen3-32B model is part of Alibaba’s Qwen3 series launched this April, which consists of eight enhanced models that range from 600 million to 235 billion parameters.

The collaboration marked “an important milestone in advancing AI inclusivity and making it more representative of Southeast Asia”, said Leslie Teo, senior director of AI products at AI Singapore.

“It embodies our shared vision of accelerating AI innovation across the region and ensuring that developers, enterprises and public institutions have access to AI that is open, affordable and locally relevant,” he added.

The latest model was trained on more than 100 billion tokens of Southeast Asian-language data, a move expected to improve its ability to interpret local expressions and regional knowledge domains, as mainstream models typically understand English or Chinese better.

The model currently ranks first among open-source models under 200 billion parameters on the SEA-HELM leaderboard, a benchmark for regional language performance.

Meanwhile, the Qwen model series has also been winning over tech firms in the West, reflecting its appeal compared with major AI models created by US developers.

Airbnb, the San Francisco-based online accommodation booking giant, relies heavily on the Qwen models to power its AI-driven customer service agent, the company's co-founder and CEO Brian Chesky said in October, adding that ChatGPT's integration abilities were not "quite ready" for Airbnb's needs.
