How will the GenAI/LLM database market evolve in 2024? Q&A with Madhukar Kumar

Q1. What were, in your opinion, the main trends for the database market in 2023?

At the end of 2022 and the beginning of 2023, I noticed several interesting trends in the database ecosystem, in particular a critical focus on upgrading data warehouses and incorporating newer technologies into them. With data estates being expansive, companies across various industries were running a blend of legacy technologies that, while effective, also posed challenges in terms of scalability and ease of manipulation. To combat these challenges and make database management faster and more efficient, many organizations turned to:

  • Cloud adoption, which not only streamlines storage, integration and automation but also lets organizations do so more cost-effectively by eliminating time spent on mundane tasks. It also allows for simpler, more streamlined security and analytics.
  • Incorporating diverse databases, ranging from data warehouses to NoSQL setups, not only to address rising cloud costs but also to let organizations leverage applications already present in their tech stack that better align with specific database types.
  • Deploying machine learning infrastructure, which remained a significant focal point even though it posed challenges for many organizations due to evolving regulations and ongoing learning curves. Doing so enabled automated workflows, streamlined problem resolution and enhanced decision support, ultimately decreasing reliance on manual intervention.

While these trends held a strong foothold in 2023, I believe we will continue to see them transform into new iterations as we approach 2024. The database ecosystem moves swiftly, with new technology continuously shaping the latest innovations.

Q2. What main lessons did you learn this year by talking with your customers?

It’s no surprise that this year has been all about AI. I have been spending time talking with customers about their vision for how AI can support their operations, and how they can make this happen using real-time and agile data management processes. Here are some of the learnings I gathered from these invigorating chats:

  • Almost all customers are evaluating generative AI: It feels like everyone, regardless of age, knows about generative AI. The hype around generative AI is at an all-time high – but what does it all truly mean? We will discover these new possibilities for years to come as more and more companies start to deploy their AI applications to production.
  • Identifying business use cases for AI is a top priority: Our customers want to uncover ways that AI can alleviate repetitive and time-consuming tasks. The important thing is that AI serves a purpose and is not implemented just for the sake of having it. Intentional deployment of AI is the focus for many organizations moving into the new year.
  • Finding scalability with simplicity: Scalability has been a significant focus for our customers. Complex architectures often hinder customers’ ability to expand their AI deployments. Using a unified architecture that transacts, analyzes and contextualizes data in a single location can help companies scale their AI projects with ease. This is how SingleStore can help organizations expand their AI capabilities.

Q3. How will the database market evolve in 2024? 

As organizations integrate AI into their operations, the need for highly scalable and secure data storage and database management systems will only increase. First, vectors and semantic search will become table stakes for all existing databases. In addition, we will see some consolidation in the vector-only database space, even as most enterprise generative AI applications adopt a Retrieval Augmented Generation (RAG) pattern.
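
To make the RAG pattern concrete, here is a minimal, illustrative Python sketch of the retrieval step: stored documents and the user’s question are turned into vectors, the closest documents are found with a similarity search, and the retrieved text is packed into the prompt sent to an LLM. The embed() function is a toy stand-in for a real embedding model, the document list is invented for illustration, and the LLM call itself is omitted; in practice the vectors would live in a database with native vector/semantic search.

```python
# Minimal RAG retrieval sketch (illustrative only, not a specific product's API).
import math
from collections import Counter

# Toy document store; in a real system these rows and their vectors
# would live in a database table with a vector column.
DOCUMENTS = [
    "Vectors and semantic search are becoming standard database features.",
    "Retrieval Augmented Generation grounds LLM answers in retrieved context.",
    "Real-time data integration matters for multimodal AI applications.",
]

def embed(text: str) -> Counter:
    # Toy "embedding": a bag-of-words count vector.
    # A production system would call an embedding model instead.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, k: int = 2) -> list[str]:
    # Semantic search step: rank stored documents by similarity to the query.
    q = embed(query)
    return sorted(DOCUMENTS, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

query = "How does RAG keep LLM answers grounded?"
context = "\n".join(retrieve(query))

# RAG step: retrieved context is prepended to the question before it is
# sent to an LLM (the actual model call is omitted here).
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)
```

The sketch only shows the shape of the pattern; the point in practice is that retrieval, not the model’s parametric memory, supplies the facts the LLM answers from.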

I also anticipate real-time data integrations becoming more important for most generative AI applications and services as AI becomes multimodal. The database market will also evolve due to increased regulatory efforts. This will likely spur more innovation in the RAG data management and retrieval patterns. For example, organizations may improve systems to ensure more control and visibility over data provenance, data access, data governance, auditing and security. This would all need to scale with extensive datasets to ensure LLMs are up to par with evolving regulations. 

Q4. Will Generative AI go mainstream in 2024?

Some could argue that generative AI is already mainstream, given the hype surrounding it throughout the year. However, moving beyond the buzz, I believe that generative AI will become more commonplace in 2024. 

For example, I expect that we’ll see better LLMs emerge, especially in the open-source community, and get small enough to run on offline devices. Enterprises like Meta and Microsoft are already on the cutting edge of AI development, and I look forward to seeing their new innovations in 2024.

Along with improved LLMs, we’ll likely see a proliferation of specialized and niche AI services tailored to specific industries and use cases, similar to initiatives like BloombergGPT. I also anticipate more organizations introducing LLMs to assist humans with their daily workloads. We have already seen major technology companies such as Microsoft and Amazon release AI upgrades and chatbots that act as co-pilots and assistants to support workers. I believe this assistive trend will continue in the new year, as opposed to companies deploying fully autonomous systems.

Another exciting innovation that I foresee in the near future is the integration of LLMs into physical devices. Given that GPUs on laptops and desktops are improving (e.g., Apple’s M3 chip with dynamic caching and the upcoming release of Intel’s Meteor Lake chip), we will see LLMs on laptops and on other home and wearable devices like Meta’s Ray-Ban glasses. These systems will offer the same level of question-and-answer capability over contextual data that we have with ChatGPT today.

In software development, we have already seen the building blocks of Artificial General Intelligence (AGI) in the form of Assistants or Agents, and next year we should see more maturity in frameworks that orchestrate these Agents and Assistants to carry out complex tasks.

Q5. And if yes, who will be the big beneficiaries of Generative AI?

The biggest beneficiaries of generative AI will be the companies that create the revolutionary technology and the people who use it to alleviate time-consuming and repetitive tasks. Generative AI has the potential to spur creativity, curiosity and innovation. From brainstorming ideas to supporting decision-making, this technology has already shown how transformative it can be to the way we work, shop and live. Those on the frontlines of generative AI development will champion a future where AI becomes a staple in our daily lives just like social media. 

Q6. Anything else you wish to add?

I’m looking forward to seeing how the AI landscape will change in the new year. As more enterprises add AI features to their products, it may lead to some shake-ups in the startup environment. Companies that emerged in 2023 as niche, use-case-specific apps may face some setbacks as large companies roll out their own AI initiatives.

However, there are still plenty of opportunities for startups to succeed, especially ones that deliver high-quality responses from top-notch LLMs. I’m also excited to see more startups enter the AI and Mixed Reality space, where AI-based technologies can work together with robotics, graphics and machine learning (ML) to merge the virtual and physical worlds.

………………………………………………….

Madhukar Kumar is the Chief Marketing Officer at SingleStore. 

Sponsored by SingleStore.
