On Hybrid Cloud. Interview with Mathias Golombek.
“Cloud is a tool, not a destination. Hybrid is the strategy that delivers both control and agility.”
Q1. What is your role and current projects at Exasol?
Mathias Golombek: As CTO of Exasol, I oversee the technical direction of our high-performance Analytics Engine, ensuring it delivers speed, scalability, and cost efficiency across on-premises, hybrid, and cloud environments. My role involves driving innovation in query performance, self-tuning optimizations, and seamless data integration to help organizations maximize the value of their analytics.
Right now, we’re focusing on:
- Enhancing hybrid data integration, making it easier for companies to run analytics across on-prem and cloud environments without performance trade-offs.
- Optimizing our query execution engine, improving parallel processing, indexing strategies, and workload balancing to ensure consistently fast performance.
- Expanding AI/ML capabilities, enabling advanced analytics workloads directly within the database without the need for additional infrastructure.
- Improving cost efficiency, refining storage and memory management to reduce operational costs while maintaining top-tier performance.
These initiatives ensure Exasol remains the most powerful, flexible, and cost-effective analytics solution for data-driven organizations.
Q2. Choosing the right infrastructure is a strategic decision that will impact an organization for years. What practical tips can you offer in this area?
Mathias Golombek: The key to making the right infrastructure choice is understanding workload requirements, regulatory constraints, and total cost of ownership (TCO). Organizations should ask:
- Performance Needs: If real-time analytics and low-latency queries are critical, an in-memory Analytics Engine like Exasol can provide superior performance to other databases.
- Data Governance & Compliance: If strict data residency and compliance laws apply (e.g., GDPR, HIPAA), an on-premises or hybrid approach may be necessary.
- Cost Predictability: Cloud costs can spiral if not managed effectively. Organizations should model workloads and compare TCO across on-prem, hybrid, and cloud options (a toy comparison follows this list).
- Scalability & Integration: Consider the need for seamless integration with existing tools and the ability to scale without costly re-architecture.
- Future-Proofing: The landscape evolves rapidly—opting for an infrastructure that supports flexibility (on-prem, cloud, hybrid) ensures long-term adaptability.
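To make the TCO modeling concrete, here is a minimal Python sketch that contrasts cumulative on-prem and cloud spend over five years. Every figure, cost category, and growth rate is an illustrative assumption, not Exasol pricing or a benchmark.

# Hypothetical five-year TCO comparison; all numbers are assumptions.
YEARS = 5

onprem_capex = 400_000          # assumed upfront hardware spend
onprem_opex_per_year = 120_000  # assumed power, cooling, staff, support

cloud_base_per_year = 250_000   # assumed compute + storage + egress, year 1
cloud_growth = 1.15             # assumed 15% annual usage-driven growth

def onprem_tco(years):
    # Flat operating cost on top of a one-time capital investment.
    return onprem_capex + onprem_opex_per_year * years

def cloud_tco(years):
    # Usage-based spend compounds as workloads grow.
    total, yearly = 0.0, cloud_base_per_year
    for _ in range(years):
        total += yearly
        yearly *= cloud_growth
    return total

for y in range(1, YEARS + 1):
    print(f"year {y}: on-prem {onprem_tco(y):>9,.0f}  cloud {cloud_tco(y):>9,.0f}")

Even a toy model like this shows why break-even analysis matters: a flat on-prem cost curve can undercut compounding usage-based spend within a few years.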
At Exasol, we see organizations increasingly favoring a hybrid-first approach, leveraging on-prem for mission-critical workloads while optimizing cloud usage for elasticity and burst processing.
Q3. Let’s talk about migrating to the Cloud: Why, How and What Makes Sense?
Mathias Golombek: Cloud migration is often driven by scalability, elasticity, and ease of management. However, it’s not a one-size-fits-all solution. The key considerations are:
- Why migrate? Organizations move to the cloud for agility, operational simplification, and dynamic scaling. However, performance-sensitive workloads may not see cost or speed benefits.
- How to migrate? A phased approach works best—starting with non-critical workloads, leveraging hybrid setups, and optimizing data architecture to prevent unnecessary cloud egress costs.
- What makes sense? A cloud-smart strategy rather than a cloud-first mandate. Many organizations are now repatriating workloads due to unpredictable costs and performance inefficiencies. Workloads requiring low latency, predictable costs, and high security often perform best on-prem or in a hybrid model.
Exasol supports flexible deployment, allowing organizations to run the same high-performance analytics across on-prem, hybrid, or cloud environments—giving them the ability to adjust strategies as needed.
Q4. On the contrary, what is the biggest advantage of on-premises computing?
Mathias Golombek: The biggest advantage of on-premises computing is predictability—in cost, performance, and security.
- Performance Optimization: On-prem allows full control over hardware and resource allocation, minimizing latency and delivering consistent high-speed analytics.
- Cost Efficiency at Scale: While cloud pricing is attractive for small workloads, long-term costs often escalate due to unpredictable storage, compute, and egress fees. A well-optimized on-prem solution has a lower total cost of ownership (TCO) over time.
- Data Control & Compliance: Industries like healthcare, finance, and government require stringent data sovereignty, regulatory compliance, and security—challenges that cloud providers can’t always meet.
- Minimal Vendor Lock-in: Cloud providers have proprietary ecosystems that can make data migration complex and costly. On-premises solutions allow full control over data access, storage, and portability.
For organizations running high-performance analytics on large datasets, Exasol’s in-memory Analytics Engine on-premises consistently outperforms cloud alternatives while maintaining cost predictability and compliance advantages.
Q5. According to a Barclays CIO survey, 83% of enterprise CIOs planned to repatriate at least some workloads in 2024. What are the reasons why companies are choosing to bring their data in-house?
Mathias Golombek: We’ve seen this shift in our own customer base. The primary drivers for workload repatriation include:
- Cost unpredictability: Cloud egress fees and unpredictable pricing models have made on-prem/hybrid more attractive for long-term analytics workloads.
- Security & control: The rise of AI and sensitive data analytics has made many organizations reconsider who controls their data and how it’s stored, processed, and accessed.
- Performance bottlenecks: Latency and performance inconsistencies in shared cloud environments make real-time analytics and high-concurrency workloads challenging.
- Regulatory compliance: Industries like banking, healthcare, and telecom face increasing data sovereignty and privacy regulations, making on-premises or hybrid solutions more viable.
Many of these shifts and their implications have been widely discussed: hybrid strategies, where compute happens on-prem while scalability is extended via the cloud, are now the preferred model.
Exasol excels in hybrid environments by providing high-speed analytics while ensuring full control over data location and processing.
Q6. According to the latest forecast by Gartner, 90% of organizations will adopt a hybrid cloud approach by 2027. What is your take on this?
Mathias Golombek: The hybrid cloud model is not just a transition phase—it’s the future of enterprise IT.
- Best of both worlds: Companies are realizing that on-prem is critical for cost efficiency, performance, and compliance, while cloud provides agility and elasticity.
- Cloud is not always cheaper: Many organizations initially moved to the cloud expecting lower costs but are now balancing workloads between cloud and on-prem to optimize spend.
- Interoperability is key: Businesses need infrastructure that integrates seamlessly across on-prem, private cloud, and public cloud without vendor lock-in.
At Exasol, we design for hybrid-first strategies—enabling organizations to scale analytics seamlessly across on-prem, hybrid, and cloud without sacrificing speed or cost efficiency.
The key takeaway? Cloud is a tool, not a destination. Hybrid is the strategy that delivers both control and agility.
Q7. In your opinion, what are the benefits of hybrid cloud?
Mathias Golombek: A hybrid cloud strategy combines the best aspects of on-premises and cloud computing, offering organizations flexibility, performance optimization, and cost efficiency while maintaining control over security and compliance. The key benefits include:
- Optimized Workload Placement: Certain workloads, such as real-time analytics and high-concurrency queries, perform better when executed on-premises due to low-latency in-memory processing and predictable performance. Cloud resources can be leveraged for burst capacity, external data ingestion, or long-term storage (a toy placement policy is sketched after this list).
- Cost Efficiency & Resource Utilization: High-performance engines like Exasol can minimize compute overhead in an on-prem deployment while still integrating with cloud object storage for cost-effective data retention.
- Data Sovereignty & Compliance: Many industries—healthcare, finance, public sector, and telecommunications—require strict data residency controls. Hybrid cloud enables organizations to process and store sensitive data on-prem while leveraging cloud services for non-sensitive workloads.
- Scalability & Elasticity: Organizations can dynamically scale resources by leveraging the cloud for compute-heavy tasks (such as machine learning inference) while keeping mission-critical workloads running on-prem for predictable performance.
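To illustrate what workload-aware placement can look like, here is a small Python sketch of a placement policy. The fields and thresholds are invented for illustration; a real policy would be driven by measured latency, cost, and compliance requirements.

# Hypothetical workload-placement policy; fields and thresholds are assumptions.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    contains_pii: bool        # data-residency / compliance constraint
    latency_sensitive: bool   # e.g. real-time dashboards, high-concurrency queries
    bursty: bool              # short-lived spikes suit cloud elasticity
    tb_scanned_daily: float   # proxy for egress exposure if placed off-prem

def place(w: Workload) -> str:
    if w.contains_pii:
        return "on-prem"   # sovereignty and compliance come first
    if w.latency_sensitive or w.tb_scanned_daily > 10:
        return "on-prem"   # keep compute close to the data
    if w.bursty:
        return "cloud"     # pay only for the spike
    return "cloud"         # default to elastic capacity

for w in (Workload("fraud-scoring", True, True, False, 2.0),
          Workload("ml-training", False, False, True, 1.0)):
    print(f"{w.name} -> {place(w)}")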
At Exasol, we optimize for hybrid deployments, ensuring seamless data virtualization, query federation, and cross-platform analytics without performance degradation.
Q8. The most common hybrid cloud example is to use public cloud with private cloud services and on-premises infrastructure. What is your take on this?
Mathias Golombek: A hybrid cloud model that combines public cloud, private cloud, and on-premises infrastructure is increasingly becoming the standard. However, the key challenge is not just deployment but ensuring seamless workload portability and data interoperability across environments.
- Latency & Performance Considerations: High-performance analytics workloads often require low-latency query execution, which is best achieved with in-memory, on-premises infrastructure or high-performance private cloud deployments rather than public cloud services that are optimized for storage over compute.
- Data Gravity & Egress Costs: Moving data between environments introduces latency penalties and unpredictable cloud egress costs. Organizations must optimize data locality and workload placement to minimize transfer inefficiencies.
- Security & Compliance: Private cloud helps enforce data sovereignty and regulatory mandates, but integration with public cloud analytics tools often leads to security trade-offs and additional access control requirements.
- Cross-Platform Query Execution: A hybrid approach only works effectively when databases support federated query execution, virtualization, and schema bridging, ensuring that data silos are avoided and workloads can scale efficiently across environments (a sketch follows this list).
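As a minimal sketch of cross-platform query execution, the following Python snippet uses the pyexasol client to join a local table with one exposed through a virtual schema. The DSN, credentials, schema, and table names are all assumptions for illustration.

# Federated-query sketch via pyexasol; connection details and names are assumed.
import pyexasol

conn = pyexasol.connect(dsn="exasol-onprem:8563", user="sys", password="***")

# Join a local, in-memory fact table with a table federated from cloud
# storage through a (hypothetical) virtual schema named CLOUD_LAKE.
rows = conn.execute("""
    SELECT o.customer_id, c.segment, SUM(o.amount) AS revenue
    FROM   sales.orders o                  -- local on-prem table
    JOIN   CLOUD_LAKE.customer_segments c  -- federated via virtual schema
           ON c.customer_id = o.customer_id
    GROUP  BY o.customer_id, c.segment
""").fetchall()

print(rows[:5])

The point of the sketch: the query is written once, and the engine resolves what to pull from the federated source, so placement decisions do not leak into application code.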
I see hybrid architectures not as a static setup but as an evolving, workload-aware strategy. Exasol’s Analytics Engine enables high-speed analytics across hybrid infrastructures by minimizing query latency, optimizing data locality, and integrating seamlessly with cloud and on-prem ecosystems—allowing organizations to maximize performance without unnecessary complexity.
Q9. What are the requirements to build and deploy Generative AI workloads?
Mathias Golombek: Deploying Generative AI (GenAI) workloads effectively requires a combination of high-performance compute, scalable storage, optimized data pipelines, and inference acceleration. The key requirements include:
High-Performance Compute Infrastructure
- Parallel Processing & MPP Architectures: Training and running large foundation models require distributed computing frameworks to optimize vectorized execution and parallel workloads.
- GPU & TPU Acceleration: Many transformer-based models rely on GPU/TPU acceleration for efficient matrix multiplications and tensor operations.
Scalable & High-Speed Storage
- Hybrid & Multi-Tiered Storage: Storing training datasets in a combination of on-prem NVMe storage (for high-speed access) and cloud object storage is a common approach.
- Data Lake Integration: Exasol’s query engine can be used to process structured and semi-structured data efficiently, ensuring high-throughput data preparation for AI pipelines.
Optimized Data Management & Feature Engineering
- Federated Data Access: GenAI models require diverse datasets—ranging from structured enterprise data to unstructured text, images, and videos. Hybrid environments must support fast ETL processes and federated queries across multiple sources.
- Vectorized Execution & Feature Store: Efficient feature engineering requires databases that support vectorized processing, indexing, and real-time transformations, with integration options for feature storage and retrieval in AI/ML workflows (a sketch follows this list).
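As a hedged illustration of in-database feature engineering, this Python snippet pushes the feature computation into the engine so that only the compact feature set, not the raw events, leaves the database. Table and column names are invented for the example.

# Hypothetical in-database feature engineering via pyexasol; names are assumed.
import pyexasol

conn = pyexasol.connect(dsn="exasol-onprem:8563", user="sys", password="***")

# Aggregate raw order events into per-customer features inside the engine.
conn.execute("""
    CREATE OR REPLACE TABLE ml.customer_features AS
    SELECT customer_id,
           COUNT(*)      AS order_count_90d,
           AVG(amount)   AS avg_order_value,
           MAX(order_ts) AS last_order_ts
    FROM   sales.orders
    WHERE  order_ts > ADD_DAYS(CURRENT_DATE, -90)
    GROUP  BY customer_id
""")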
Inference Optimization & Model Deployment
- Inference Optimization & Data Access: AI workloads require efficient data retrieval and transformation pipelines. Exasol enables near real-time analytics and feature engineering for AI models while integrating with external ML platforms for model training and inference.
- Real-Time AI Integration: Using high-speed analytical databases like Exasol ensures that GenAI models can query and process real-time data without performance bottlenecks.
Security, Compliance, & Governance
- Data Sovereignty & Compliance Controls: Many AI workloads process sensitive PII data, requiring on-prem data governance while allowing cloud-based AI training.
- RBAC & Secure AI Pipelines: Implementing role-based access control (RBAC), model versioning, and explainability frameworks ensures AI transparency and compliance with industry standards.
How does this work in practice? With Exasol, for example, users can integrate LLMs in three ways:
- Exasol In-database LLM Deployment:
Download your chosen language model into Exasol’s internal file system (BucketFS) and access it via User Defined Functions (UDFs). This method guarantees that your data, queries, and prompts remain securely within your environment, minimizing exposure to external networks.
- Connect to locally hosted LLM:
Integrate with LM Studio and other language model services managed within your own network and infrastructure for a balance of security and flexibility.
- API-Based Integration:
Connect directly to external language model APIs using UDFs. This option provides rapid access to the latest models without the need for local deployment, offering flexibility and speed.
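As a rough sketch of the third option, the Python snippet below creates an Exasol PYTHON3 UDF that forwards a prompt to an external LLM endpoint, then applies it in SQL. The endpoint URL, response format, schema, and column sizes are illustrative assumptions, not a specific provider's API.

# Sketch of API-based LLM integration through a UDF; endpoint and names assumed.
import pyexasol

conn = pyexasol.connect(dsn="exasol-onprem:8563", user="sys", password="***")
conn.execute("CREATE SCHEMA IF NOT EXISTS ai")

conn.execute("""
CREATE OR REPLACE PYTHON3 SCALAR SCRIPT ai.ask_llm("prompt" VARCHAR(20000))
RETURNS VARCHAR(2000000) AS
import json, urllib.request

def run(ctx):
    # Forward the prompt to a hypothetical completion endpoint.
    req = urllib.request.Request(
        "https://llm.example.com/v1/completions",
        data=json.dumps({"model": "my-model", "prompt": ctx.prompt}).encode(),
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("text", "")
""")

# The engine fans the UDF out across the result set in parallel.
for (answer,) in conn.execute(
        "SELECT ai.ask_llm(review_text) FROM shop.reviews LIMIT 3"):
    print(answer)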
We focus on accelerating AI-driven analytics by providing low-latency, high-performance query processing, ensuring efficient data preparation, real-time feature engineering, and on-premises and hybrid AI deployments.
Qx. Anything else you wish to add?
Mathias Golombek: As organizations continue to evolve their hybrid and AI-driven analytics strategies, the focus should be on:
- Workload-specific infrastructure choices rather than forcing cloud adoption where it doesn’t provide cost or performance benefits.
- Optimizing structured data processing to support AI-driven insights and decision-making while ensuring seamless integration with external unstructured data sources.
- Minimizing operational complexity by leveraging self-tuning, high-performance analytics engines that seamlessly integrate across on-prem, cloud, and hybrid environments.
At Exasol, we are committed to pushing the boundaries of analytics performance, ensuring organizations can extract real-time insights from massive datasets while optimizing cost, scalability, and security.
………………………………………….

Mathias Golombek
Mathias Golombek is the Chief Technology Officer (CTO) of Exasol. He joined the company as a software developer in 2004 after studying computer science with a heavy focus on databases, distributed systems, software development processes, and genetic algorithms. By 2005, he was responsible for the Database Optimizer team and in 2007 he became Head of Research & Development. In 2014, Mathias was appointed CTO. In this role, he is responsible for product development, product management, operations, support, and technical consulting.