Stability AI Releases Two Japanese-based LLMs


Stability AI Japan has released two Japanese language models, “Japanese Stable LM 3B-4E1T” and “Japanese Stable LM Gamma 7B.” The former has approximately 3 billion parameters, while the latter is a 7-billion-parameter model. Both have been made available under the Apache 2.0 license, which permits commercial use.

We have released the Japanese large language models “Japanese Stable LM 3B-4E1T” and “Japanese Stable LM Gamma 7B.”

With approximately 3 billion and 7 billion parameters respectively, these models rank among the best in Japanese task performance evaluations.
They can also be used commercially under the Apache 2.0 license… pic.twitter.com/N9M31UhdL0

— Stability AI Japan Official (@StabilityAI_JP) October 25, 2023

These models are built upon previously released English language models: “Stable LM 3B-4E1T,” published by Stability AI in August 2023, and “Mistral-7B-v0.1,” published by Mistral AI in September 2023. The base models were trained predominantly on English data, giving them high proficiency in English but limited Japanese capability, owing to the scarcity of Japanese text in their training data.

To enhance their Japanese language abilities, the models underwent continued pretraining on approximately 100 billion tokens of Japanese and English data drawn from sources such as Wikipedia, mC4, CC-100, OSCAR, and SlimPajama (excluding Books3).
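For readers who want to try the released checkpoints, they can be loaded with the Hugging Face transformers library. The snippet below is a minimal sketch, not an official example: the Hub ID stabilityai/japanese-stablelm-base-gamma-7b, the generation settings, and the Japanese prompt are assumptions, since the article does not name the exact repositories.

```python
# Minimal sketch: generating Japanese text with one of the released base models.
# The Hub ID below is an assumption; the article does not give the repository name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stabilityai/japanese-stablelm-base-gamma-7b"  # assumed Hub ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # halves memory on GPU; use float32 on CPU
    device_map="auto",          # places layers on the available device(s)
)

prompt = "日本の四季について説明してください。"  # "Please explain Japan's four seasons."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```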

The models were evaluated with the same methodology used for “Japanese Stable LM Alpha,” released in August 2023. The evaluation comprised eight tasks from Japanese language understanding benchmarks (JGLUE), covering sentence classification, sentence-pair classification, question answering, and text summarization.
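Base language models are commonly scored on multiple-choice benchmarks of this kind by comparing the likelihood the model assigns to each candidate answer. The sketch below illustrates that general technique; the Hub ID, prompt template, and toy question are hypothetical, and this is not Stability AI’s actual evaluation code.

```python
# Hedged sketch of log-likelihood scoring for a multiple-choice question:
# the candidate answer the model finds most probable is its prediction.
# The question and choices are illustrative, not actual JGLUE items.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stabilityai/japanese-stablelm-3b-4e1t-base"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)
model.eval()

question = "空が青く見える主な理由は？"           # "Why does the sky look blue?"
choices = ["光の散乱", "海の反射", "大気の温度"]  # light scattering / sea reflection / air temperature

def sequence_logprob(text: str) -> float:
    """Sum of the log-probabilities the model assigns to each token of `text`."""
    ids = tokenizer(text, return_tensors="pt").input_ids.to(model.device)
    with torch.no_grad():
        logits = model(ids).logits
    log_probs = torch.log_softmax(logits[0, :-1], dim=-1)  # position i predicts token i+1
    targets = ids[0, 1:]
    return log_probs.gather(1, targets.unsqueeze(1)).sum().item()

scores = [sequence_logprob(f"質問: {question} 答え: {c}") for c in choices]
print("prediction:", choices[scores.index(max(scores))])
```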

Despite having only 3 billion parameters, Japanese Stable LM 3B-4E1T outperformed the earlier Japanese Stable LM Base Alpha 7B. Japanese Stable LM Gamma 7B scored higher still, showcasing the advances in Japanese natural language processing these models deliver.

The post Stability AI Releases Two Japanese-based LLMs appeared first on Analytics India Magazine.

Disclaimer

We strive to uphold the highest ethical standards in all of our reporting and coverage. We at StartupNews.fyi want to be transparent with our readers about any potential conflicts of interest that may arise in our work. It’s possible that some of the investors we feature may have connections to other businesses, including competitors or companies we write about. However, we want to assure our readers that this will not have any impact on the integrity or impartiality of our reporting. We are committed to delivering accurate, unbiased news and information to our audience, and we will continue to uphold our ethics and principles in all of our work. Thank you for your trust and support.


