Falcon Arabic sets sights on Arabic-language AI dominance

23 July 2025

This year could prove a pivotal one in the battle for Arabic AI supremacy, with the United Arab Emirates’ (UAE) Falcon Arabic representing the latest salvo. 

Reuters reports that Falcon Arabic, one of two new models in the Falcon ecosystem announced in May, “aims to capture the full linguistic diversity of the Arab world through a ‘high-quality native (non-translated) Arabic dataset.’” And given the UAE’s close relationship with the US and the advanced AI technology that relationship supplies, it may well be poised to deliver on that ambition. The Falcon ecosystem, developed by Abu Dhabi’s Advanced Technology Research Council (ATRC), is bolstered by billions in government funding in a high-stakes race to develop the most sophisticated Arabic AI model.

"Today, AI leadership is not about scale for the sake of scale. It is about making powerful tools useful, usable, and universal," ATRC secretary general Faisal Al Bannai, said in a press release. 

The Arabic language poses unique challenges for AI: its right-to-left script, its cursive letterforms that connect within each word, its morphological complexity, the abundance of regional dialects, a lack of robust datasets, and other linguistic and technical hurdles.
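
To see what that morphological complexity looks like in practice, consider a single Arabic word that folds a conjunction, a tense marker, a verb, and an object pronoun into one string. The short sketch below is illustrative only: it assumes the open-source Hugging Face transformers library and uses a generic multilingual tokenizer (bert-base-multilingual-cased), not any of the Falcon tokenizers, to show how such a word splinters into subword pieces when a tokenizer is not built with Arabic in mind.

```python
# Illustrative sketch only: assumes the Hugging Face "transformers" package
# and a generic multilingual tokenizer, not any of the Falcon tokenizers.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")

# A single Arabic word, "وسيكتبونها" ("and they will write it"), bundles a
# conjunction, a future marker, a verb stem, and an object pronoun.
word = "وسيكتبونها"

# A tokenizer not tuned for Arabic typically splits it into several subword pieces.
print(tokenizer.tokenize(word))
```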

With those challenges in mind, building a model that serves the entire Arab world is no small feat. And Mobile World Live reports that Falcon Arabic may be living up to the hype. The UAE’s Technology Innovation Institute (TII), the applied research arm of the ATRC, claims it is the best-performing Arabic model in its size class, matching the performance of systems 10 times as large.

The secret to Falcon Arabic’s success is likely its high-quality dataset, according to Business Wire. Built on the 7-billion-parameter Falcon 3-7B base model, it was trained on both Modern Standard Arabic and regional dialects.

Similarly impressive is Falcon-H1, a second model launched in May, thanks to its small computational footprint. Falcon-H1 can run on a single graphics processing unit (GPU), opening the technology to small innovators with big ideas.

“We’re proud to finally bring Arabic to Falcon, and prouder still that the best-performing large language model in the Arab world was built in the UAE,” said ATRC Secretary General H.E. Faisal Al Bannai at the Make it in the Emirates event in Abu Dhabi, according to Business Wire. “Today, AI leadership is not about scale for the sake of scale. It is about making powerful tools useful, usable, and universal. Falcon-H1 reflects our commitment to delivering AI that works for everyone — not just the few.”   

Falcon-H1 supports European languages and scales to cover more than 100 languages through “a multilingual tokenizer trained on diverse datasets.” Much like the shock China’s DeepSeek delivered to the world at the beginning of 2025, Falcon-H1 aims to deliver AI systems that can run on capable local devices rather than through cloud computing.

“Not everyone has access to high-end GPUs to run models,” Hakim Hacid, chief researcher at TII’s AI and Digital Science Research Centre, told Capacity magazine. “We paid a lot of attention to offering models that can run on consumer devices so that people can access this model to run applications on their devices instead of running them in the cloud.”
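
For developers who want to test that local-first pitch, the sketch below shows what single-device inference can look like using the open-source Hugging Face transformers library. It is a minimal, hypothetical example rather than TII’s official recipe: the checkpoint id tiiuae/Falcon-H1-0.5B-Instruct is taken from TII’s public Hugging Face listings purely to illustrate a small-footprint model, and a recent transformers release with Falcon-H1 support is assumed.

```python
# Minimal sketch of local inference: assumes the "torch" and "transformers"
# packages, with a transformers release recent enough to include Falcon-H1
# support. The checkpoint id is one of the smaller public Falcon-H1 variants,
# used here only as an illustration of single-device, non-cloud inference.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/Falcon-H1-0.5B-Instruct"
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16 if device == "cuda" else torch.float32,
).to(device)

prompt = "ما هي عاصمة الإمارات؟"  # "What is the capital of the UAE?"
inputs = tokenizer(prompt, return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```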