UAE Unveils GPT-4’s New Rival, Falcon 180B


Adding to the list of open-source LLMs, Abu Dhabi’s TII has released Falcon 180B, a highly scaled-up version of Falcon 40B. According to the official blog post, this is the largest open-source language model, boasting a staggering 180 billion parameters. 

Back in June, the institute had released three variants of Falcon – 1B, 7B, and 40B.

Falcon 180B was trained on a dataset of 3.5 trillion tokens from TII’s RefinedWeb dataset, making it the longest single-epoch pretraining run for an openly accessible model. Training involved the simultaneous use of up to 4,096 GPUs on Amazon SageMaker. The chat model was fine-tuned on a combination of large conversational and instruction datasets.

Imagine that your moat is money and you try to compete with state level funding of the UAE

— Yam Peleg (@Yampeleg) September 6, 2023

Falcon 180B vs Llama 2 vs GPT-3.5

Currently topping the Hugging Face leaderboard, Falcon 180B is 2.5 times the size of Llama 2 and was trained with four times the compute. It also outperforms Llama 2 70B and OpenAI’s GPT-3.5 on MMLU. Architecturally, it uses multi-query attention (MQA).
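In multi-query attention, all query heads share a single key/value head instead of each head having its own, which shrinks the key/value cache and speeds up inference at large scale. The following is a minimal illustrative sketch of the idea in numpy (toy shapes and function names are our own, not TII's implementation):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multi_query_attention(q, k, v):
    """Multi-query attention: many query heads, ONE shared key/value head.

    q: (n_heads, seq_len, d_head) -- a separate projection per query head
    k, v: (seq_len, d_head)       -- a single K/V head shared by all query heads
    """
    d = q.shape[-1]
    # Every query head attends over the same shared keys.
    scores = q @ k.T / np.sqrt(d)      # (n_heads, seq_len, seq_len)
    weights = softmax(scores, axis=-1)
    return weights @ v                 # (n_heads, seq_len, d_head)

# Toy example: 4 query heads, sequence length 5, head dimension 8.
rng = np.random.default_rng(0)
out = multi_query_attention(rng.normal(size=(4, 5, 8)),
                            rng.normal(size=(5, 8)),
                            rng.normal(size=(5, 8)))
print(out.shape)  # (4, 5, 8)
```

Compared with standard multi-head attention, the K/V tensors here are stored once rather than once per head, which is the memory saving MQA is designed for.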

Additionally, it exhibits results comparable to Google’s PaLM 2-Large on benchmarks including HellaSwag, LAMBADA, WebQuestions, Winogrande, PIQA, ARC, BoolQ, CB, COPA, RTE, WiC, WSC, and ReCoRD. However, it does not yet perform as well as GPT-4.

 Read more: Llama 2 vs GPT-4 vs Claude-2 

Commercial Use

While the research paper for the model has not yet been released, Falcon 180B can be used commercially, but only under restrictive conditions that exclude any “hosting use” – making it less commercially friendly than previous Falcon models.

Open Source Gets a New Player

Surpassing Meta’s Llama 2, Falcon 180B is now the largest open-source language model.

Even though Meta has championed the open-source ecosystem, its approach comes with its own set of restrictions, such as a complicated licensing policy. And even Meta appears to be taking a closed-door approach with its upcoming models, which are touted to be even bigger and better.

Meanwhile, no matter how much effort Meta puts in, the real controller of open source is OpenAI, as AIM reported earlier. With the release of Google’s Gemini drawing closer, it is high time OpenAI released GPT-5 to stay ahead in the race.

Read more: Meta Launches Open Source Models, OpenAI Controls Them

The post UAE Unveils GPT-4’s New Rival, Falcon 180B appeared first on Analytics India Magazine.

Disclaimer

We strive to uphold the highest ethical standards in all of our reporting and coverage. We at StartupNews.fyi want to be transparent with our readers about any potential conflicts of interest that may arise in our work. It’s possible that some of the investors we feature may have connections to other businesses, including competitors or companies we write about. However, we want to assure our readers that this will not have any impact on the integrity or impartiality of our reporting. We are committed to delivering accurate, unbiased news and information to our audience, and we will continue to uphold our ethics and principles in all of our work. Thank you for your trust and support.

Sarthak Luthra
Hey, there! I am the tech guy. I get things running around here and I post sometimes. ~ You must have heard the name; now see the work too :-)
