
Stepfun's founder says Chinese AI can thrive with bigger models and more data

Stepfun's founder champions scaling laws and multimodality in AI development, predicting a trillion-parameter model revolution in China's AI industry.

Bigger models and more data remain the keys to better artificial intelligence, according to Jiang Daxin, founder of Stepfun, a Shanghai-based AI start-up. Jiang believes in the power of scaling laws in large language model (LLM) development, and despite challenges such as lower investment and restricted access to advanced chips in China, he remains optimistic.

Jiang, who used to work at Microsoft, shared his thoughts at the World Artificial Intelligence Conference (WAIC) in Shanghai. He predicts that LLMs will eventually reach hundreds of trillions of parameters, greatly enhancing their capabilities.

The promise of scaling laws

Scaling laws describe the relationship between an AI model's performance and its number of parameters. Generally, larger models trained on more data with greater computational resources perform better, although the improvements can slow down after a certain point. Big tech companies invest heavily in advanced hardware, particularly Nvidia's H100 chips, to maximise performance.
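The relationship described above can be sketched as a power law: loss falls as parameter count grows, but with diminishing returns. The minimal sketch below uses a Chinchilla-style functional form; the constants are illustrative stand-ins, not fitted values for any of the models mentioned in this article.

```python
# Illustrative scaling-law sketch: loss as a power law in parameter count.
# The constants below are illustrative, not a published fit for any real model.

def predicted_loss(n_params: float, a: float = 406.4, alpha: float = 0.34,
                   irreducible: float = 1.69) -> float:
    """Loss = a * N^(-alpha) + irreducible  (Chinchilla-style form)."""
    return a * n_params ** (-alpha) + irreducible

# Loss keeps falling from 1 billion up to 100 trillion parameters,
# but each tenfold jump in size buys a smaller improvement.
for n in (1e9, 1e11, 1e13, 1e14):
    print(f"{n:.0e} params -> predicted loss {predicted_loss(n):.3f}")
```

The shape of this curve is what the paragraph above alludes to: bigger is reliably better, yet the gains per extra parameter shrink, which is why data availability and return on investment become the binding constraints.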

Jiang highlighted this trend in his talk. “The advancements in OpenAI’s GPT series, which powers ChatGPT, and the massive investments in supercomputing centers by companies like Amazon, Microsoft, and Meta show that scaling laws work,” he said on Saturday. However, he cautioned that the availability of data, skilled personnel, and concerns about return on investment could affect the pace of these advancements.

Since OpenAI launched ChatGPT in late 2022, Chinese tech giants and start-ups have been racing to develop their own LLMs. China now has over 200 AI models, including Alibaba's Tongyi Qianwen and Baidu's Ernie. (Alibaba owns the South China Morning Post, which reported the news.) Yet many Chinese AI firms struggle to match the spending power of their US counterparts and focus instead on revenue-generating applications.

Stepfun’s innovative models

Founded in April 2023, Stepfun has been dedicated to developing fundamental models. At WAIC, the company launched Step-2, a trillion-parameter LLM, along with the Step-1.5V multimodal model and the Step-1X image generation model.

Jiang also emphasized the importance of multimodality in creating a comprehensive AI. Multimodal models can process visual and other data types to develop internal representations of the external world. He explained that Stepfun aims to combine generative and comprehension abilities in a single model.

Stepfun also offers consumer-facing products, such as Yuewen, a ChatGPT-like personal assistant, and Maopaoya, an AI companion that can take on various character personalities.

The future of AI investment

“Last year, global AI investments reached US$22.4 billion, with 70 to 80 percent going to companies developing large models,” said Alex Zhou Zhifeng, managing partner at Qiming Venture Partners, at another WAIC side event. Qiming was an early investor in Stepfun.

Zhou noted that more investments in AI applications are expected soon, partly due to decreasing token costs. In AI, a token is a basic data unit processed by algorithms.
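Zhou's point about falling token costs is easy to make concrete: inference bills scale with token count times per-token price, so a price cut lowers the cost of every application built on top. The sketch below uses a crude whitespace split as a stand-in for a real tokenizer, and the prices are invented round numbers, not any vendor's actual rates.

```python
# Toy illustration of tokens and per-token pricing.
# Whitespace splitting is a crude proxy for a real tokenizer, and the
# prices below are invented for illustration only.

def count_tokens(text: str) -> int:
    """Crude proxy: one token per whitespace-separated word."""
    return len(text.split())

prompt = "Summarise the latest scaling law results for large language models"
tokens = count_tokens(prompt)

old_price = 0.002 / 1000   # assumed $ per token before a price cut
new_price = 0.0002 / 1000  # assumed $ per token after a 10x cut

print(f"{tokens} tokens cost ${tokens * old_price:.6f} before the cut")
print(f"and ${tokens * new_price:.7f} after")
```

Real tokenizers split text into subword units rather than words, but the economics are the same: cheaper tokens make more AI applications commercially viable, which is why investors expect money to shift toward the application layer.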

Peng Wensheng, an economist at China International Capital, added that China's AI model market is projected to reach about 5.2 trillion yuan (US$715.1 billion) by 2030. The industrial AI market is expected to be around 9.4 trillion yuan.

This optimistic outlook suggests a bright future for AI development in China, driven by the potential of scaling laws and innovative models like those from Stepfun.
