
KDB.AI Documentation

KDB.AI is a powerful knowledge-based vector database and search engine that lets developers build scalable, reliable, real-time applications by providing advanced search, recommendation, and personalization for AI applications using real-time data.

Vector databases can process and search vast amounts of text data. By converting text data to a vector format (embeddings), computers can better understand and respond to natural language inputs, such as in chatbots or virtual assistants.
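
To make this concrete, here is a minimal sketch of turning text into embeddings with an off-the-shelf open-source model. The sentence-transformers package and model name are illustrative assumptions, not part of KDB.AI; any embedding model or API that returns fixed-length numeric vectors fits the same workflow.

```python
# Minimal sketch: converting text into vector embeddings.
# sentence-transformers and the model name are illustrative assumptions;
# any model or API that returns fixed-length numeric vectors works similarly.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # returns 384-dimensional vectors

texts = [
    "How do I reset my password?",
    "What are your opening hours?",
]
vectors = model.encode(texts)  # numpy array of shape (2, 384)
print(vectors.shape)
```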

KDB.AI is:

  • Relevant - Improve searches with temporal and semantic context
  • Time aware - Compare data from moments in time to analyze trends or changes
  • Comparable - Relate similar data contexts through like-to-like search results
  • Real-time - Search and index your data with unmatched speed

What can KDB.AI do?

KDB.AI allows you to set up a knowledge-based vector database and search engine in a few simple steps. With KDB.AI you can (see the sketch after this list):

  • Create an index of vectors (Flat, IVF, IVFPQ, or HNSW).
  • Append vectors to an index.
  • Perform fast vector similarity search with optional metadata filtering.
  • Persist an index to disk.
  • Load an index from disk.
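
The sketch below walks through these steps with the Python client. It assumes the kdbai_client package with a 1.x-style interface (Session, create_table, insert, search); the endpoint, API key, table name, schema, and filter syntax are placeholders, so check the client reference for the exact signatures in your version.

```python
# Sketch only: assumes the kdbai_client Python package with a 1.x-style API
# (Session, create_table, insert, search). Endpoint, API key, schema, and
# filter values are placeholders -- consult the client reference for the
# exact signatures in your version.
import numpy as np
import pandas as pd
import kdbai_client as kdbai

session = kdbai.Session(endpoint="YOUR_ENDPOINT", api_key="YOUR_API_KEY")

# Create a table with a Flat vector index over 8-dimensional embeddings.
schema = {
    "columns": [
        {"name": "id", "pytype": "str"},
        {"name": "tag", "pytype": "str"},
        {"name": "embeddings",
         "vectorIndex": {"dims": 8, "metric": "L2", "type": "flat"}},
    ]
}
table = session.create_table("documents", schema)

# Append vectors (rows) to the index.
rows = pd.DataFrame({
    "id": ["doc1", "doc2"],
    "tag": ["faq", "guide"],
    "embeddings": list(np.random.rand(2, 8).astype("float32")),
})
table.insert(rows)

# Similarity search with optional metadata filtering on the "tag" column
# (the filter tuple syntax here is illustrative).
query = [np.random.rand(8).astype("float32").tolist()]
results = table.search(vectors=query, n=1, filter=[("like", "tag", "faq")])
print(results)
```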

What are vector databases?

Vector database overview

A vector database is a specialized architecture that stores vector embeddings (lists of numeric values) to represent data points. In contrast, traditional databases typically rely on structured tables with rows and columns, where each row represents a single data entry and each column corresponds to a specific attribute or field of the data.

Understanding indexes

Vector databases rely on a key component known as the vector index. The index is built by applying an algorithm to the vector embeddings stored in the database, mapping them to a data structure designed for rapid search. Searches run faster against this condensed representation than against the raw embeddings, and it also reduces memory requirements. With KDB.AI you don't need to build indexes by hand; they are created through simple commands. However, understanding how vector indexes work, and how the different types differ, helps you choose the right one for your workload.
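
As a rough mental model (an illustration of the concept, not KDB.AI internals), a Flat index behaves like the exhaustive scan sketched below, comparing the query against every stored vector; approximate indexes such as IVF, IVFPQ, and HNSW trade a little accuracy for much faster lookups on large collections.

```python
# Conceptual illustration of what a Flat (exhaustive) vector search does:
# compare the query against every stored embedding and return the closest.
# Approximate indexes (IVF, IVFPQ, HNSW) avoid scanning everything.
import numpy as np

rng = np.random.default_rng(0)
stored = rng.random((10_000, 128), dtype=np.float32)  # 10k stored embeddings
query = rng.random(128, dtype=np.float32)

# L2 distance from the query to every stored vector, then take the top 5.
distances = np.linalg.norm(stored - query, axis=1)
top5 = np.argsort(distances)[:5]
print(top5, distances[top5])
```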

Understanding LLMs

Large Language Models (LLMs) and AI chatbots are among the key use cases for vector databases. Vector databases enable fast retrieval of embedded contextual data and, combined with other components, form the backbone of conversational AI.

Head to our Learning Hub to learn more about vector databases.

Capacity

KDB.AI Starter Edition offers the following capacities:

  • Storage - 20 GB
  • RAM - 4 GB
  • Query size - 10 MB

When inserting data, keep each request under 10 MB. You can split large insertions into batches, or chunks, which can also improve query performance; a batching sketch follows below.
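
A simple way to stay under the limit is to chunk the data before inserting, as in the sketch below. The table handle is assumed to be one created earlier with the Python client, and the batch size of 2,000 rows is only an example; size batches so each request stays under 10 MB.

```python
# Sketch: insert a large DataFrame in fixed-size batches instead of one call.
# `table` is assumed to be a KDB.AI table handle (see the earlier example);
# 2,000 rows per batch is an arbitrary example -- size batches so each
# request stays under the 10 MB query limit.
import numpy as np
import pandas as pd

data = pd.DataFrame({
    "id": [f"doc{i}" for i in range(10_000)],
    "tag": ["faq"] * 10_000,
    "embeddings": list(np.random.rand(10_000, 8).astype("float32")),
})

batch_size = 2_000
for start in range(0, len(data), batch_size):
    table.insert(data.iloc[start:start + batch_size].reset_index(drop=True))
```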

Get Started

You can access the KDB.AI Starter Edition in two ways: KDB.AI Cloud and Self-Managed. Go to our Pre-requisites page to find out how to get started with either version.

Support

Slack channel

For assistance using KDB.AI or if you have a question that isn't addressed here in the documentation, post a question on our community Slack channel.

Submit an idea

If you have an idea for improvement or collaboration in KDB.AI, then head to our Ideas Portal.

Support for Your KDB.AI Database

Report bugs or issues when running your database to our Support Team.