Deci Open-Sources Foundational Models, Surpasses Meta’s LLaMA & Stable Diffusion


Deci, a deep learning company, has launched its generative AI foundation models, which it claims deliver inference up to 15 times faster than Meta’s LLaMA 2. DeciDiffusion, DeciLM 6B, and a new inference Software Development Kit (SDK) are setting a new standard for performance and cost efficiency in generative AI. Unlike closed-source API models, Deci provides unrestricted access to its models, which can be self-hosted anywhere.

The computational requirements for training and inference of genAI models hinder teams from cost-effectively launching and scaling genAI applications. With its latest releases, the Israel-based company is directly addressing this gap, making inference at scale efficient, cost-effective, and ready for enterprise-grade integration.

Faster and More Accurate

AI researchers can reduce their inference compute costs by up to 80% by using Deci’s open-source models together with Infery LLM, running on widely available GPUs such as the NVIDIA A10. The Deci models cater to diverse applications, ranging from content and code generation to image creation and chat, among many others.
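To make the cost claim concrete, here is a minimal sketch of how a throughput speedup on the same GPU translates into per-token cost savings. The GPU price and throughput figures below are hypothetical illustrations, not numbers reported by Deci; the point is only the arithmetic: a 5x throughput gain at the same hourly GPU cost yields an 80% lower cost per token.

```python
# Illustrative arithmetic: cost per million generated tokens on a rented GPU.
# All numbers here are hypothetical, chosen only to show how a throughput
# speedup maps to a cost reduction on identical hardware.

def cost_per_million_tokens(gpu_hourly_usd: float, tokens_per_second: float) -> float:
    """USD to generate one million tokens at a given throughput."""
    tokens_per_hour = tokens_per_second * 3600
    return gpu_hourly_usd * 1_000_000 / tokens_per_hour

# Hypothetical: same $1/hour GPU, baseline model vs. a 5x-faster model.
baseline = cost_per_million_tokens(gpu_hourly_usd=1.0, tokens_per_second=100)
faster = cost_per_million_tokens(gpu_hourly_usd=1.0, tokens_per_second=500)

savings = 1 - faster / baseline
print(f"Cost reduction: {savings:.0%}")  # 5x throughput -> 80% lower cost per token
```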

One of the models, ‘DeciDiffusion 1.0,’ is a fast text-to-image model. It can generate high-quality images in under a second, outperforming the competing Stable Diffusion 1.5 model by a factor of three.

DeciLM 6B, with 5.7 billion parameters, is set apart by its inference speed: 15 times faster than Meta’s LLaMA 2 7B. Rounding out the lineup is ‘DeciCoder,’ a 1-billion-parameter code-generation LLM that delivers exceptional inference speed while matching or exceeding accuracy standards.

Yonatan Geifman, Deci’s CEO and co-founder, emphasised the need for mastery over model quality, the inference process, and cost in the world of generative AI. 

These models were crafted using Deci’s proprietary Neural Architecture Search technology, AutoNAC. Alongside its foundation models, Deci introduced Infery LLM, an inference SDK that enables developers to gain a significant performance speed-up on existing LLMs while retaining the desired accuracy.

“With Deci’s solutions, companies receive both enterprise-grade quality and control, as well as the flexibility to customise models and the inference process according to their precise requirements,” said Prof. Ran El Yaniv, Chief Scientist and co-founder of Deci.

“DeciLM-6B has set a new gold standard, outperforming Llama 2 7B’s throughput by an astonishing 15 times. This achievement is attributed to Deci’s cutting-edge neural architecture search engine, AutoNAC,” said Akshay Pacchar, lead data scientist at TomTom, in a tweet.

The post Deci Open-Sources Foundational Models, Surpasses Meta’s LLaMA & Stable Diffusion appeared first on Analytics India Magazine.


Sarthak Luthra
