Core42, the G42-owned UAE technology company focused on AI solutions and services, has launched Jais 30B, the most advanced version of its open-source Arabic LLM. The 30-billion-parameter model follows the 13-billion-parameter Jais model released in August, further strengthening the UAE's position in the LLM race.
The model was trained on the powerful AI supercomputer Condor Galaxy 1 (CG-1), built by G42 in partnership with Cerebras Systems, which offers four exaFLOPS of training compute, 54 million cores, and 64 nodes. Jais 30B was trained on larger datasets of 126 billion Arabic tokens, 251 billion English tokens and 50 billion code tokens, delivering improved performance over its predecessor. It offers 160% longer and more detailed answers in Arabic and a 233% increase in English, and it also performs better at summarisation.
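Since Jais is released as an open-source model, the sketch below shows how such a checkpoint might be loaded for inference with the Hugging Face transformers library. The model identifier `core42/jais-30b-v1` and the need for `trust_remote_code=True` are assumptions for illustration, not details confirmed by the article; the official model card should be consulted for the exact identifier and requirements.

```python
# Minimal sketch: loading an open-source Jais checkpoint for text generation.
# The model ID "core42/jais-30b-v1" is hypothetical; check the official
# Hugging Face model card for the actual identifier and hardware needs.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "core42/jais-30b-v1"  # assumed identifier, not confirmed by the article
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",        # spread the 30B weights across available GPUs
    trust_remote_code=True,   # assumed: the model may ship a custom model class
)

prompt = "ما هي عاصمة الإمارات؟"  # Arabic: "What is the capital of the UAE?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```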
The UAE's endeavour to establish itself as a major player in the LLM race is ongoing. Recently, Abu Dhabi-based TII released Falcon 180B, the largest open-source language model available.
Demographic-Specific Models
Jais was born from a collaboration between Core42, Mohamed bin Zayed University of Artificial Intelligence and Cerebras Systems. Interestingly, OpenAI recently partnered with G42, the parent of Core42, which will most likely benefit both parties in the LLM race. With Jais 30B, the UAE's focus on creating demographic-specific models, similar to India's Bhashini and Indus models, is gaining momentum.