TII Unveils Falcon Mamba 7B, Outperforming Llama 3.1 8B and Other SLMs

The Technology Innovation Institute (TII), the applied research arm of Abu Dhabi’s Advanced Technology Research Council (ATRC), has launched the Falcon Mamba 7B, a groundbreaking addition to its Falcon series of LLMs. An open-source State Space Language Model (SSLM), Falcon Mamba 7B has been independently verified by Hugging Face to outperform its competitors.

Marking a significant departure from previous Falcon models, which relied on transformer-based architecture, the Falcon Mamba 7B introduces SSLM technology to the Falcon lineup. This model not only outperforms Meta’s Llama 3.1 8B, Llama 3 8B, and Mistral’s 7B in new benchmarks but also claims the top spot on Hugging Face’s tougher benchmark leaderboard.

Source: Falcon

SSLMs excel at processing complex, time-evolving information, making them well suited to tasks like book-length comprehension, estimation, forecasting, and control. Falcon Mamba 7B demonstrates superior capabilities in Natural Language Processing, machine translation, text summarisation, computer vision, and audio processing, with significantly lower memory requirements compared to traditional transformer models.
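The memory advantage comes from the state-space recurrence itself: where attention caches every past token, an SSM carries only a fixed-size hidden state from step to step. The toy sketch below (a plain linear SSM with made-up parameters, not Falcon Mamba’s actual selective parameterisation) illustrates that constant-memory property:

```python
import numpy as np

def ssm_scan(A, B, C, xs):
    """Run a linear state-space recurrence over a 1-D input sequence.

    h_t = A @ h_{t-1} + B * x_t   (state update; h has fixed size)
    y_t = C @ h_t                 (readout)

    Memory per step is O(state_size), independent of sequence length --
    unlike attention, whose key/value cache grows with every token.
    """
    h = np.zeros(A.shape[0])
    ys = []
    for x in xs:
        h = A @ h + B * x        # only the current state is retained
        ys.append(float(C @ h))
    return ys

# Toy parameters, purely illustrative
A = np.array([[0.9, 0.0], [0.1, 0.8]])
B = np.array([1.0, 0.5])
C = np.array([1.0, -1.0])
print(ssm_scan(A, B, C, [1.0, 0.0, 0.0]))
```

However long the input sequence, the loop keeps only the two-element state `h`, which is the structural reason SSLMs can handle book-length inputs without the memory growth of a transformer.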

H.E. Faisal Al Bannai, Secretary General of ATRC and Adviser to the UAE President for Strategic Research and Advanced Technology Affairs, said: “The Falcon Mamba 7B marks TII’s fourth consecutive top-ranked AI model, reinforcing Abu Dhabi as a global hub for AI research and development. This achievement highlights the UAE’s unwavering commitment to innovation.”

Source: Falcon

TII Continues Growth with SLMs

Marking a focused shift toward building small language models, Hakim Hacid, executive director and acting chief researcher at the Technology Innovation Institute (TII), discussed the move with AIM in an exclusive interaction earlier this year.

“We were asking at some point the question, as to how big should we go? I think now the question is how small we could go by keeping a small model,” said Hacid, adding that TII is exploring that path.

He further explained that TII is making models smaller because, in his words, “if we want the pillar of the deployment to succeed, we need to actually have models that can run in devices, and in infrastructure that is not highly demanding.”

With over 45 million downloads of Falcon LLMs to date, the Falcon Mamba 7B continues TII’s tradition of pioneering research and open-source contributions. The model will be released under the TII Falcon License 2.0, a permissive software licence based on Apache 2.0, emphasising the responsible use of AI.

TII continues to build on its open-source culture, even while acknowledging that not everyone will be able to sustain it. “You need a lot of funding to sustain open-source and we believe that not everyone will be able to do it,” said Hacid.


Disclaimer

We strive to uphold the highest ethical standards in all of our reporting and coverage. We at StartupNews.fyi want to be transparent with our readers about any potential conflicts of interest that may arise in our work. It’s possible that some of the investors we feature may have connections to other businesses, including competitors or companies we write about. However, we want to assure our readers that this will not have any impact on the integrity or impartiality of our reporting. We are committed to delivering accurate, unbiased news and information to our audience, and we will continue to uphold our ethics and principles in all of our work. Thank you for your trust and support.
