Weaviate Offers Suite of Solutions to Fast-Track AI Applications into Their ‘Production Era’


Weaviate, an AI-native vector database company, is releasing a developer “workbench” of tools and apps along with flexible tiered storage to help organizations put AI into production.

Inspired by its vibrant open source community, Weaviate’s new developer offerings accelerate AI application development and provide end-to-end solutions for some of the most common AI use cases, helping organizations make the leap from AI prototypes to production, according to the company.

The suite of tools includes:

  • Recommender app: Provides a fully managed, low-code solution for rapid development of scalable, personalized recommendation systems. Recommender offers configurable endpoints for item-to-item, item-to-user, and user-to-user recommendation scenarios and supports images, text, audio, and other forms of multimodal data.
  • Query tool: Enables developers to query data in Weaviate Cloud using a GraphQL interface; a programmatic equivalent is sketched after this list. Available now through the Weaviate Cloud Console.
  • Collections tool: Allows users to create and manage collections in Weaviate Cloud without writing any code. Available now through the Weaviate Cloud Console.
  • Explorer tool: Lets users search and validate object data through a graphical user interface (GUI). Coming soon to Weaviate Cloud Console.
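
For developers who prefer working in code, the snippet below is a minimal sketch of the workflows the Collections and Query tools expose in the Cloud Console, written against the Weaviate Python client (v4) rather than the console's GraphQL interface. The collection name, schema, vectorizer choice, and sample query are illustrative assumptions, not part of the announcement.

```python
# A minimal sketch (not Weaviate's own tooling) of what the Collections and
# Query tools automate, using the Weaviate Python client (v4).
# Names and values below are illustrative placeholders.
import weaviate
from weaviate.classes.config import Configure, DataType, Property
from weaviate.classes.init import Auth

# Connect to a Weaviate Cloud instance (URL and API key are placeholders).
client = weaviate.connect_to_weaviate_cloud(
    cluster_url="https://YOUR-CLUSTER.weaviate.cloud",
    auth_credentials=Auth.api_key("YOUR-API-KEY"),
)

try:
    # Roughly what the Collections tool does through its GUI:
    # define a collection with a couple of properties and a vectorizer.
    client.collections.create(
        name="Article",
        properties=[
            Property(name="title", data_type=DataType.TEXT),
            Property(name="body", data_type=DataType.TEXT),
        ],
        vectorizer_config=Configure.Vectorizer.text2vec_openai(),
    )

    # Roughly what the Query tool exposes via GraphQL in the console:
    # a semantic (near-text) search over the collection.
    articles = client.collections.get("Article")
    results = articles.query.near_text(query="vector databases in production", limit=3)
    for obj in results.objects:
        print(obj.properties["title"])
finally:
    client.close()
```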

To fuel development of new apps, Weaviate has debuted a Labs division dedicated to testing daring ideas and turning the best into Weaviate products.

Among its first projects, Weaviate Labs is developing an app to help teams quickly deploy production-ready Generative Feedback Loops for AI agents and take the next step beyond retrieval-augmented generation (RAG).

To meet the needs of diverse AI use cases, Weaviate’s new storage tiers and tenant offloading capabilities allow users to optimize for performance or cost; a brief programmatic sketch of moving a tenant between tiers follows the tier list below.

Low-latency applications closely tied to revenue, such as ecommerce and recommendation engines, can continue to be optimized for performance, while applications with higher latency tolerances, such as chatbots, can scale cost-efficiently, the company said.

“We’ve seen AI applications move into production at scale. Now the AI-native stack needs to evolve so organizations can build AI applications faster and deploy them at lower cost. We’re entering the ‘Production Era,’ where we start to see real impact from AI,” said Bob van Luijt, CEO and co-founder of Weaviate. “Listening to our community, it’s clear that to take the next step, developers need an AI-native framework with flexible storage tiers, modular GUI tools to interact with their data, and a pipeline of new concepts to spark their creativity.”

The storage options include:

  • Hot - the highest-performance tier, for real-time read/write access
  • Warm - balances accessibility and cost for data that is readily available but used less frequently
  • Cold - cost-effective, long-term storage for infrequently used data, with slower access
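
As a rough illustration of how the tiers and tenant offloading might be driven programmatically, the sketch below uses the tenant activity-status API of the Weaviate Python client (v4). The collection and tenant names are placeholders, the exact enum members can vary by client version, and offloading to cold storage also depends on server-side configuration.

```python
# A minimal sketch of moving a tenant between storage tiers with the
# Weaviate Python client (v4). Collection and tenant names are placeholders;
# TenantActivityStatus members may differ across client versions, and cold
# offloading requires a suitably configured cluster.
import weaviate
from weaviate.classes.config import Configure
from weaviate.classes.init import Auth
from weaviate.classes.tenants import Tenant, TenantActivityStatus

client = weaviate.connect_to_weaviate_cloud(
    cluster_url="https://YOUR-CLUSTER.weaviate.cloud",  # placeholder
    auth_credentials=Auth.api_key("YOUR-API-KEY"),      # placeholder
)

try:
    # Multi-tenancy must be enabled for a collection before individual
    # tenants can be offloaded.
    client.collections.create(
        name="ChatHistory",
        multi_tenancy_config=Configure.multi_tenancy(enabled=True),
    )
    chats = client.collections.get("ChatHistory")
    chats.tenants.create(tenants=[Tenant(name="customer-a")])

    # Offload an infrequently used tenant to the cold tier to reduce cost...
    chats.tenants.update(tenants=[
        Tenant(name="customer-a", activity_status=TenantActivityStatus.OFFLOADED)
    ])

    # ...and reactivate it (hot tier) when low-latency access is needed again.
    chats.tenants.update(tenants=[
        Tenant(name="customer-a", activity_status=TenantActivityStatus.ACTIVE)
    ])
finally:
    client.close()
```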

Weaviate’s new Query and Collections tools are now available to Weaviate Cloud users through the Weaviate Cloud Console. The new storage tiers are available for all users of Weaviate Enterprise Cloud and Weaviate Database (Open Source).

For more information about this news, visit https://weaviate.io.

