Global interest in AI is growing, with most attention focused on countries that are home to many of the world's leading AI models, such as the US and China. Southeast Asia, one of the world's important economic regions, is gradually becoming a new AI hot spot, with advances that could interest global policymakers, investors and technology experts.

In Vietnam, the Party and Government are determined to shape and drive the country's era of technological development, as demonstrated by Resolution 57-NQ/TW on breakthroughs in science and technology development, innovation and national digital transformation. This orientation has created a strong incentive for domestic technology enterprises to invest in and exploit the potential of advanced global technology.
Vietnam has now become one of the few countries in Southeast Asia to own a domestic large language model (LLM). Since 2023, Zalo has successfully trained and launched an LLM focused on Vietnamese, researched and developed entirely by a team of Vietnamese engineers.
Choosing to train from scratch, and getting unexpected results
There are currently two main techniques for training AI models: fine-tuning, which optimizes a previously trained LLM to create a new model for specialized purposes, and training from scratch, which builds a completely new model, from choosing the architecture and initializing the parameters to running the training algorithm on a chosen data set.
Of the two, fine-tuning is the technique most businesses choose because it is easy to implement, saves resources and can still deliver good results. In a context like Vietnam's, where training hardware and data are limited, fine-tuning would seem the more practical solution. The sketch below illustrates how the two approaches differ.
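A minimal sketch of the two approaches, using the Hugging Face transformers library. The library, base checkpoint and vocabulary size here are illustrative assumptions; the article does not say which tools or base models Zalo actually used.

from transformers import AutoConfig, AutoModelForCausalLM

# Fine-tuning: start from an existing pretrained LLM and adapt its weights
# to a new, specialized data set (e.g. Vietnamese text).
finetune_model = AutoModelForCausalLM.from_pretrained("gpt2")  # any pretrained checkpoint

# Training from scratch: define the architecture yourself and initialize
# all parameters randomly, then learn everything from your own data.
config = AutoConfig.from_pretrained("gpt2")   # reuse an architecture definition...
config.vocab_size = 50_000                    # ...but choose your own tokenizer/vocabulary (assumed value)
scratch_model = AutoModelForCausalLM.from_config(config)  # random weights, no pretrained knowledge

The trade-off is cost: fine-tuning reuses weights that have already been learned, while a from-scratch model must learn everything from the training data, which demands far more compute and data.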
However, Zalo chose to train from scratch. With this method, the entire training process and the resulting model are fully owned and controlled by Vietnamese engineers. Thanks to that choice, Vietnam has become one of the few countries in Southeast Asia to own a domestically developed large language model (LLM).
At its launch in 2023, Zalo's first large-scale Vietnamese-focused language model, with 7 billion parameters, scored 150% of OpenAI's GPT-3.5 result on the VMLU Vietnamese LLM competency benchmark. Training took only 6 months, far shorter than the originally planned 18 months, a pace that surprised the entire Zalo development team.

Zalo's LLM ranked 3rd in the Kahoot challenge at its first launch in 2023 (Photo: Zalo).
In 2024, Zalo's 13-billion-parameter model surpassed big international names to secure a top-2 position among Vietnamese LLMs trained from scratch on VMLU's Vietnamese LLM competency ranking.
The results show that Vietnamese engineers' ability to train large language models is not inferior to the world's, and that Vietnam can develop an AI model of its own despite the many difficulties of the early stages of development.
Vietnamese efforts to develop AI models
A Zalo representative said that LLM training requires three core elements: training hardware, data and technical expertise, and that Vietnam previously had significant limitations in all three. While large companies around the world own thousands of the latest high-performance Nvidia GPUs, engineers in Vietnam lacked the necessary server infrastructure. Vietnamese is also classified among the languages with poorer data resources than English or Chinese, and Vietnam's pool of engineers with LLM training experience is small compared to developed countries.
At the time, the Zalo team had to research and experiment on small consumer-grade GPUs to build up LLM training knowledge and capability, so that they would be ready as soon as large-scale computing infrastructure became available.
AI training chips were scarce, so even after ordering eight Nvidia DGX H100 servers, Zalo could not take delivery of all the hardware at once and had to wait for each batch from the manufacturer. Making the most of this incomplete computing infrastructure to save training time was another problem the team had to solve.
At the same time, the team invested in high-quality training data to compensate for the scarcity of Vietnamese data sources.
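The article does not describe the specific optimizations Zalo used. As one hedged illustration of how a team can compensate when fewer GPUs are available than planned, the generic PyTorch sketch below uses gradient accumulation to keep the effective batch size constant on limited hardware; it is not a description of Zalo's actual setup.

import torch

ACCUMULATION_STEPS = 8  # assumed value: 8 small steps approximate 1 large-batch step

def train_epoch(model, loader, optimizer, loss_fn, device="cuda"):
    """One training epoch with gradient accumulation on limited hardware."""
    model.train()
    optimizer.zero_grad()
    for step, (inputs, targets) in enumerate(loader):
        inputs, targets = inputs.to(device), targets.to(device)
        loss = loss_fn(model(inputs), targets)
        (loss / ACCUMULATION_STEPS).backward()   # scale so accumulated gradients average correctly
        if (step + 1) % ACCUMULATION_STEPS == 0:
            optimizer.step()                      # update once per accumulated "large" batch
            optimizer.zero_grad()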

“Although we started from a more difficult position than large companies elsewhere in the world, Zalo still decided to join the race with the goal of successfully developing Vietnam's own AI model. We consulted researchers and engineers at many of the world's leading research institutes to shape a suitable development strategy.
The milestones achieved so far motivate Zalo's engineers to keep optimizing the model, scaling it up and improving its quality, while also exploiting its applicability to create world-class AI products for Vietnamese users,” shared Dr. Nguyen Truong Son, Director of Science at Zalo AI.

A DGX H100 server that Zalo ordered from Nvidia (Photo: Zalo).
Thanks to flexible adaptation during the difficult early stages of development, Zalo has gradually reached milestone after milestone, moving toward mastering global AI technology as it does today.
Today, Zalo's AI model is successful not only in training research but also in application, helping Vietnamese users access and extract value from advanced new technology.
Earlier this year, Zalo launched the Kiki Info Q&A assistant, which operates as an OA (Official Account) on the Zalo messaging platform. The assistant answers questions on a wide range of everyday topics and supports content creation and entertainment. According to Zalo's statistics, Kiki Info reached 1 million users accessing its OA on Zalo in less than 2 months.

Zalo's LLM is used to power the Kiki Info assistant (Photo: Zalo).
Another application of Zalo's LLM is AI greeting cards, which reached 15 million cards created and sent in just 2 months. Many Zalo users use the feature to send wishes to relatives and friends on important holidays.
Zalo continues to expand and develop applications built on its large language models, promising to bring more useful value to domestic users.
Source: https://dantri.com.vn/cong-nghe/zalo-phat-trien-mo-hinh-ai-do-nguoi-viet-lam-chu-20250616161352610.htm