A Conversation with Mark Lockareff: The Existential Threat Hiding in Your Data Stack

Q1. Most executives understand they need better data and better AI models, but the movement of data isn’t typically on their radar. When you talk to boards and C-suites about why milliseconds matter to their business outcomes, what usually surprises them most about where their competitive advantage is actually being lost?

A: What surprises C-suite executives and boards the most is the existential threat posed by what I call the “half-life of data value.” It’s the realization that their competitive advantage is being lost not because of a lack of processing power, but because of the latency introduced every time data moves. It’s a shock to learn that an entire legacy infrastructure can be fundamentally incompatible with modern business requirements. The unique competitive value found in data for things like fraud detection, real-time ad bidding, and operational optimization exists for mere milliseconds and degrades instantly into historical context. Yet current systems, designed for batch processing, force data through time-consuming Extract-Transform-Load (ETL) cycles, meaning enterprises are paying millions for AI and advanced analytics that only ever see “cold” data, hours too late. That makes the in-place analytics capability of a modern data stack an imperative.

Q2. You mentioned in earlier exchanges that enterprises still relying on legacy, multi-system data stacks will face a stark choice: shift to unified, real-time platforms that eliminate latency bottlenecks—or watch competitors with superior data-motion capabilities capture markets, customers, and opportunities.

What does that migration path realistically look like for a large financial services firm or retailer with decades of accumulated systems? Is this a “rip and replace” moment, or can organizations modernize incrementally without losing the race?

A: Large enterprises saddled with legacy, multi-system data stacks must abandon the batch-processing mentality or lose the competitive race entirely. But the migration path can’t be a full “rip and replace.” It must be a strategic, incremental modernization that focuses on restructuring the data flow away from ETL, which is incompatible with deriving value from real-time data. To do this, companies must layer on a single, unified platform that sits in front of existing databases and is fast enough and scalable enough to hold and process data in memory and analyze it without moving it. 
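To make the idea of analyzing data without moving it concrete, here is a minimal, illustrative sketch using Apache Ignite, the open-source in-memory platform GridGain builds on. The cache name, sample values, and fraud-style query threshold are assumptions for illustration only, not a description of GridGain's product API; the point is simply that records land in memory and are queried in place, with no ETL hop to a separate warehouse.

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.query.SqlFieldsQuery;
import org.apache.ignite.configuration.CacheConfiguration;

public class InPlaceAnalyticsSketch {
    public static void main(String[] args) {
        // Start a local in-memory data node with the default configuration.
        try (Ignite ignite = Ignition.start()) {
            // A simple key/value cache; indexing the value type makes it queryable via SQL.
            CacheConfiguration<Long, Double> cfg =
                new CacheConfiguration<Long, Double>("txnAmounts")
                    .setIndexedTypes(Long.class, Double.class);

            IgniteCache<Long, Double> txns = ignite.getOrCreateCache(cfg);

            // Transactions land in memory as they arrive...
            txns.put(1L, 120.50);
            txns.put(2L, 9_999.00);

            // ...and are analyzed in place, while the data is still "hot":
            // count transactions above an illustrative fraud-review threshold.
            SqlFieldsQuery suspicious = new SqlFieldsQuery(
                "SELECT COUNT(*) FROM Double WHERE _val > ?").setArgs(5_000.0);

            System.out.println("Flagged transactions: " + txns.query(suspicious).getAll());
        }
    }
}

The same pattern extends to the incremental-migration point above: the in-memory layer sits in front of existing databases, so new real-time workloads query the hot copy while legacy systems continue to run unchanged.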

Q3. Even if a company has the technical capability for real-time data, does it help if their decision-making processes, compliance reviews, and organizational culture still operate on daily or weekly cycles? How much of the real-time data advantage is actually a people and process problem rather than a technology problem?

A: The real-time data challenge definitely involves people and processes as well as technology. Even if a company integrates a unified, real-time platform capable of sub-millisecond analysis, that capability is rendered useless if the organizational culture is constrained by daily or weekly compliance reviews and human decision cycles. Real-time value requires instantaneous decisions and meeting customer needs in the moment. The biggest challenge for many companies today isn’t the technology, which is now readily available; it’s changing the company’s culture and procedures so that people and systems can actually take advantage of sub-millisecond latency.

Q4. Real-time, low-latency infrastructure isn’t cheap. How should business leaders think about the ROI calculation when investing in real-time data capabilities? What questions should they ask to determine whether their industry and use cases actually require millisecond response times versus “good enough” performance at lower cost?

A: Business leaders shouldn’t view the cost of sub-millisecond latency technology as just another expense; it’s now a requirement for competing and avoiding the risk of falling behind. The question they must ask is simple: “If our system is delayed by hours (the old batch method) or even seconds, will we frustrate customers, miss opportunities, or lose money?” If the task involves information that loses its value instantly, like blocking fraud, making a lightning-fast online ad bid, or adjusting a factory robot in under 100 milliseconds, the cost of being slow is far higher than the price of modernization.

Q5. If eliminating data movement becomes a key differentiator, which industries or types of companies do you think will be caught most off-guard? And conversely, who’s already positioned to win because they’ve been building these capabilities?

A: The companies most caught off-guard will be large, established enterprises across retail, telecom, and banking that are heavily reliant on legacy multi-system data stacks for core operations. Constrained by yesterday’s batch-processing mentality, they risk failing to capture competitive advantage “in the moment,” despite all the data they collect. Meanwhile, many financial services firms have already built the necessary low-latency infrastructure to identify arbitrage opportunities and execute trades in microseconds. This has positioned them to easily extend those systems to new problems across the enterprise, such as supply chain optimization and personalized customer engagement, accelerating innovation where speed is a critical factor.

…………………………………………………

Mark Lockareff is the CEO of GridGain, the provider of a unified real-time data storage and processing platform for transactional, analytical and AI workloads. A tech industry veteran, Lockareff has served dozens of companies as a CEO, senior executive, board member, advisor, and venture capital investor in the SaaS, enterprise infrastructure, big data, storage and cloud markets.
