On why a vector database is essential to scale generative AI apps. Q&A with James Corcoran, Chief Growth Officer, KX

If generative AI represents significant business potential for an organization’s future, existing databases will hold that company back.

Q1: Vector databases are a hot topic, but do we really need another category of database to maximise the opportunities presented by generative AI?

The short answer is yes! If generative AI represents significant business potential for an organization’s future, existing databases will hold that company back. Very few companies still use technology from a half-century ago, yet many of today’s relational databases were designed back in the 1970s. They still do an incredible job at what they were designed for – numbers and text organized in rows and columns – but generative AI requires a far greater set of capabilities, principally the ability to perform vector processing across multiple use cases.

Vector databases and AI are an ideal pairing, since AI generally, and large language models (LLMs) specifically, rely on processing data in vectors and matrices. This isn’t a programming preference; it’s just how the fundamental mathematics of AI works, most notably but not exclusively in the deep neural networks that are at the heart of the LLM. 
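To make that vector-and-matrix arithmetic concrete, here is a minimal sketch of a similarity lookup. The embedding values and document labels are entirely made up for illustration; real models produce embeddings with hundreds or thousands of dimensions:

```python
import numpy as np

# Hypothetical 4-dimensional embeddings for three documents.
doc_embeddings = np.array([
    [0.9, 0.1, 0.0, 0.2],  # "quarterly earnings report"
    [0.1, 0.8, 0.3, 0.0],  # "server outage postmortem"
    [0.0, 0.2, 0.9, 0.3],  # "gardening tips"
])

# Hypothetical embedding of a finance-related prompt.
query = np.array([0.8, 0.2, 0.1, 0.2])

# Normalize the rows and the query; a single matrix-vector product then
# yields the cosine similarity of the query against every document at once.
doc_norm = doc_embeddings / np.linalg.norm(doc_embeddings, axis=1, keepdims=True)
q_norm = query / np.linalg.norm(query)
scores = doc_norm @ q_norm

best = int(np.argmax(scores))  # index of the most similar document (0 here)
```

This is why the "fundamental mathematics of AI" maps so directly onto vector storage and processing: retrieval collapses into linear algebra that hardware and databases can optimize.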

Both RDBMSs and data lakes can play a role in AI – as a store of data to feed into an AI app, or as a storage location for the same app’s output – but they are at best a sub-optimal fit for this kind of data.

However, as with any hot trend, it’s important to understand the strengths and limitations of this new breed of database; not all are created equal.

Q2: Why are traditional relational databases insufficient for handling the data demands of generative AI?

It’s estimated that 80 to 90 percent of the world’s data is unstructured – such as audio, images, video, and social media posts – which these databases were not designed to handle. Data lakes can store unstructured data but have limited computation abilities for these data types. Thus, for the purposes of generative AI, which relies on processing data in vectors and matrices, relational databases are at best a sub-optimal fit.

Q3: What makes vector databases a better fit for generative AI?

Vector databases offer several advantages over traditional relational databases, even ones that can accommodate vector embeddings.

  • Ideal for AI applications: Designed to ingest structured and unstructured data, vector databases can process data and run AI models on it. They can hold raw data suitable for AI analysis, such as time series, and support search from prompts more effectively than alternatives that merely simulate vector databases.
  • Optimized for temporal data: By organizing data based on time, vector databases allow users to play back and inspect historical data, helping them understand system behavior, detect anomalies, or unravel trades. This enables real-time transaction management, risk modeling, reconciliation, and informed predictive maintenance and health workflows.
  • High efficiency: Vector databases use search techniques such as approximate nearest neighbor (ANN) search over indexes and vector embeddings to accelerate retrieval, allowing vector-based workloads to be processed up to 100 times faster than traditional data stores at a fraction of the query cost. They also enable querying more dimensions of data, such as time-stamped data, providing greater insights.
  • Continuous improvement: Vector databases leverage the native capabilities of modern processor architectures, such as multicore CPUs, neural processing units, and heterogeneous computing (GPU and CPU on the same chip). For instance, GPUs are natively vector-oriented, making them well suited to optimized search, but the best vector databases can also exploit in-memory compute on the CPU.
  • No reliance on GPUs: Vectorized programming and storage in vector databases do not necessarily require GPUs, saving cost and resources, especially given the rising cost of computing and the constrained supply of GPU chips.
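The approximate nearest neighbor point above can be illustrated with a minimal sketch. The linear scan below is the exact search that an ANN index accelerates; the function, data, and seed are illustrative assumptions, not any particular vendor’s API:

```python
import numpy as np

def top_k_exact(index_vectors, query, k=2):
    """Exact nearest-neighbor search by cosine similarity.

    Production vector databases replace this O(n) linear scan with an
    approximate index (e.g. HNSW graphs or IVF partitions) that trades
    a small amount of recall for large speedups at scale.
    """
    norms = np.linalg.norm(index_vectors, axis=1) * np.linalg.norm(query)
    scores = index_vectors @ query / norms
    order = np.argsort(scores)[::-1][:k]  # best-first indices
    return list(order), scores[order]

# Illustrative embeddings: 1,000 random 64-dimensional vectors.
rng = np.random.default_rng(seed=0)
vectors = rng.normal(size=(1000, 64))

# A query that is a slightly noisy copy of item 42 should retrieve it first.
q = vectors[42] + rng.normal(scale=0.01, size=64)

ids, sims = top_k_exact(vectors, q, k=3)
```

Because CPUs execute the matrix product here with vectorized (SIMD) instructions, even this brute-force version runs without a GPU, which is the point of the last bullet above.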

Q4: What value can businesses realize by using vector databases for generative AI applications?

Vector databases offer significant value, particularly for time-sensitive data. They enable:

  • Improved customer experience: AI powered by vector processing can provide personalized recommendations faster and offer better service by understanding the customer-vendor relationship.
  • Faster innovation: By identifying patterns based on historical and real-time data, companies can confidently predict market trends, make better strategic decisions, and develop innovative products that meet customer needs.
  • Higher productivity: Timely insights help organizations, such as manufacturers and medical organizations, reduce waste and improve output, empowering individual operators to take corrective action in the moment and allowing managers to monitor processes and quality continuously.

Q5: What should organizations consider when implementing vector databases?

When introducing vector databases, treat the proof of concept as a pilot, focusing on applications with broad possibilities rather than highly specific use cases. Plan for subsequent use cases and build levels of abstraction into the pilot so it scales more smoothly across the enterprise. When choosing a vendor, prioritize proven ones with a track record, strong support, responsive customer service, and a proper customer success team with deep experience in data modeling and preparation. Weigh all the factors that matter for generative AI, including planning, architecture, and governance. Start with vectors rather than retrofitting a relational database, but look beyond vector embeddings and generated content to the golden use cases that make a real difference.


James Corcoran, Chief Growth Officer, KX

As Chief Growth Officer, James is responsible for driving strategic and operational initiatives to accelerate KX’s growth and for building the KX developer and user community across the world. He joined First Derivatives in 2009 and spent several years building front office trading and analytics platforms, using KX technology, for many of the world’s largest financial institutions. He was subsequently appointed to a number of leadership positions, including CTO, and led the global expansion of the product and solution engineering teams through a combination of organic growth and the integration of strategic acquisitions. He holds an MSc in Quantitative Finance from University College Dublin.


Sponsored by KX.
