Redis Enterprise Cloud Partners with Amazon Bedrock to Create Generative AI Apps  

Redis, Inc. today announced the integration of Redis Enterprise Cloud’s vector database capabilities with Amazon Bedrock, a service designed to facilitate the creation of generative AI applications using foundation models (FMs). 

The integration is intended to simplify application development for customers by combining the developer efficiency and scalability of a fully managed, high-performance database with straightforward API access to an array of leading FMs.

Used as a vector database alongside Amazon Bedrock, Redis Enterprise Cloud will offer hybrid semantic search capabilities to pinpoint relevant data, and it can also be deployed as an external, domain-specific knowledge base. This allows large language models (LLMs) to receive the most relevant and up-to-date context, which improves result quality and reduces undesirable model hallucinations.
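For illustration, here is a minimal sketch of the vector-search pattern described above, assuming a Redis Enterprise Cloud (or Redis Stack) database with the search module enabled and the redis-py and numpy packages installed; the index name, key prefix, field names, and 384-dimension embedding size are hypothetical choices, not part of the announcement:

```python
import numpy as np
import redis
from redis.commands.search.field import TextField, VectorField
from redis.commands.search.indexDefinition import IndexDefinition, IndexType
from redis.commands.search.query import Query

r = redis.Redis(host="localhost", port=6379)  # assumption: replace with your Redis Cloud endpoint

# Index hash keys prefixed "doc:" with a text field and an HNSW vector field
# holding 384-dimensional float32 embeddings compared by cosine distance.
r.ft("docs_idx").create_index(
    fields=[
        TextField("content"),
        VectorField("embedding", "HNSW",
                    {"TYPE": "FLOAT32", "DIM": 384, "DISTANCE_METRIC": "COSINE"}),
    ],
    definition=IndexDefinition(prefix=["doc:"], index_type=IndexType.HASH),
)

# Store one document; in practice the embedding would come from an embedding model.
vec = np.random.rand(384).astype(np.float32)  # placeholder embedding
r.hset("doc:1", mapping={"content": "Redis pairs with Bedrock for RAG.",
                         "embedding": vec.tobytes()})

# KNN query: return the 3 stored documents closest to the query embedding.
query_vec = np.random.rand(384).astype(np.float32)  # placeholder query embedding
q = (Query("*=>[KNN 3 @embedding $vec AS score]")
     .sort_by("score")
     .return_fields("content", "score")
     .dialect(2))
for doc in r.ft("docs_idx").search(q, query_params={"vec": query_vec.tobytes()}).docs:
    print(doc.content, doc.score)
```

Hybrid search in this setting combines such vector similarity with the text or tag filters of the same query, which is what lets the database pinpoint relevant, domain-specific passages for the model.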

For organizations implementing Retrieval-Augmented Generation (RAG) architectures or LLM caching, this integration eliminates the need to build their own models, customize existing ones, or share proprietary data with a commercial LLM provider.
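As a rough sketch of the RAG flow the article refers to, the snippet below embeds a question with Bedrock, assumes the passages retrieved from Redis (as in the previous sketch) have been joined into `context`, and then asks a Bedrock foundation model to answer from that context. The model IDs and request/response shapes follow Amazon's Titan models as commonly documented, but should be treated as assumptions and checked against the current Bedrock documentation:

```python
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")  # assumption: region and AWS credentials configured

question = "How does Redis integrate with Amazon Bedrock?"

# 1. Embed the question (model ID and body shape assumed for Titan Embeddings).
emb_resp = bedrock.invoke_model(
    modelId="amazon.titan-embed-text-v1",
    body=json.dumps({"inputText": question}),
)
query_embedding = json.loads(emb_resp["body"].read())["embedding"]

# 2. Use query_embedding in the Redis KNN search from the previous sketch and
#    join the retrieved passages into a single context string (elided here).
context = "...retrieved passages from Redis..."

# 3. Augment the prompt with the retrieved context and ask a foundation model.
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
gen_resp = bedrock.invoke_model(
    modelId="amazon.titan-text-express-v1",
    body=json.dumps({"inputText": prompt,
                     "textGenerationConfig": {"maxTokenCount": 512}}),
)
print(json.loads(gen_resp["body"].read())["results"][0]["outputText"])
```

An LLM cache follows the same retrieval step: previous prompts are stored in Redis with their embeddings and responses, and when a new prompt's embedding is sufficiently similar, the cached response is returned instead of calling the model again.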

The integration of Redis Enterprise Cloud and Amazon Bedrock will be available through AWS Marketplace, further enhancing accessibility and convenience for developers and organizations.

“The combination of these robust serverless platforms will help accelerate generative AI application development through efficient infrastructure management and seamless scalability required by these applications,” said Tim Hall, chief product officer at Redis.

“Customers are keen to use techniques like RAG to ensure that FMs deliver accurate and contextualized responses. This integration of Amazon Bedrock and Redis Enterprise Cloud will help customers streamline their generative AI application development process by simplifying data ingestion, management, and RAG in a fully-managed serverless manner,” said Atul Deo, general manager of Amazon Bedrock at AWS.

Sergio Prada, CTO at Metal, highlighted the versatility and reliability of Redis Enterprise Cloud, noting its critical role in the platform’s success. This collaboration between Redis and AWS signifies a step forward in deploying LLM applications for enterprises.

The post Redis Enterprise Cloud Partners with Amazon Bedrock to Create Generative AI Apps appeared first on Analytics India Magazine.
