On Applying Data Fabrics Across Industries. Q&A with Joe Lichtenberg.

“A data fabric is an architectural pattern that strives to create consistency among all data and metadata in an organization to make data easy to find, access, and use.  A data fabric can be a critical component of a successful data governance program.”

Q1. Are data governance and data fabric the same thing? 

Data governance and data fabric are related, but they are very different.  Data governance is an overarching set of initiatives that strive to define and enforce the quality, usage, and security of data within an organization.  It includes policies, standards, processes, rules, roles and responsibilities, privileges and more.  In contrast, a data fabric is an architectural pattern that strives to create consistency among all data and metadata in an organization to make data easy to find, access, and use.  A data fabric can be a critical component of a successful data governance program.

Q2. What problems does data fabric solve for the financial services industry? 

We see many practical applications of a data fabric with our financial services customers; the general problem it solves, however, is providing an organization with a modern data architecture that ensures that all consumers of the data have access to a consistent set of accurate, current, trusted, and secure information.

In general, a data fabric provides a modern approach to creating a single source of truth from all the disconnected, disparate, and dissimilar data sources inside and outside the organization, one that feeds all consumers of the data, whether that’s business users, applications, data scientists, clients, regulators, and so on.  It also provides a consistent and overarching metadata layer, and a semantic layer that maintains relationships among the various data and metadata.  A data fabric can eliminate the errors and redundancies introduced by maintaining multiple individual data repositories that serve different consumers of the data.  It should allow data to be optionally persisted or virtualized (not persisted), handle real-time streaming data as well as batch data at scale, natively manage a wide variety of data types including unstructured data (multi-model), and embed analytics to enable real-time advanced analytic processing without moving the data to a different environment (a smart data fabric).
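To make the persisted-versus-virtualized idea concrete, here is a minimal conceptual sketch in Python. All names are hypothetical, invented for illustration; this is not an InterSystems API. It shows one harmonized query interface over two dissimilar sources, where one source is copied into the fabric's store and the other is left in place and fetched on demand.

```python
# Conceptual sketch of a data fabric's harmonization layer.
# Hypothetical names -- not a real product API.
from dataclasses import dataclass
from typing import Callable, Iterator, List


@dataclass
class Record:
    """The fabric's one consistent schema, shared by every source."""
    customer_id: str
    amount: float
    source: str


class PersistedSource:
    """Data copied into the fabric's own store (e.g. a warehouse table)."""
    def __init__(self, rows: List[Record]):
        self._rows = rows  # already normalized at load time

    def read(self) -> Iterator[Record]:
        yield from self._rows


class VirtualizedSource:
    """Data left in place; a fetch function pulls and normalizes it on demand."""
    def __init__(self, fetch: Callable[[], List[dict]], mapper):
        self._fetch, self._mapper = fetch, mapper

    def read(self) -> Iterator[Record]:
        for raw in self._fetch():          # no copy is ever persisted
            yield self._mapper(raw)


def fabric_query(sources, predicate) -> List[Record]:
    """One query interface spanning all connected sources."""
    return [r for s in sources for r in s.read() if predicate(r)]


# Demo: a CRM export (persisted) plus a live payments feed (virtualized).
crm = PersistedSource([Record("C1", 120.0, "crm"), Record("C2", 80.0, "crm")])
payments = VirtualizedSource(
    fetch=lambda: [{"cust": "C1", "amt": "45.50"}],  # stand-in for an API call
    mapper=lambda raw: Record(raw["cust"], float(raw["amt"]), "payments"),
)

big_spenders = fabric_query([crm, payments], lambda r: r.amount > 100)
```

Consumers of the data see only `Record` and `fabric_query`; whether a given source is persisted or virtualized is an implementation detail of the fabric, which is the point of the pattern.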

Q3. Jey Amalraj, CTO of Harris Associates, a financial services asset management firm with $100 billion in assets under management, presented a Keynote at the InterSystems Global Summit 2023, focusing on “Leveraging a Smart Data Fabric for Financial Services”. What was his main message? 

Harris Associates is one of our customers, and Jey has been leading data management initiatives in the financial services industry for decades.  Their main requirement is exactly what I mentioned earlier: creating a single source of truth, spanning all data sources, that serves all of their consumers of the data.  Over the course of his career he has worked with most of the different data management technologies, including point-to-point integrations, integration platforms, data marts, data lakes, and so on.

His quote regarding data fabrics is, “I’ve been working with data for 25 years. We’ve been through a few solutions. We’ve finally found something which works.”

Q4. How do you implement a data fabric for financial services? 

A key attribute of data fabrics is that they’re non-disruptive to an organization’s existing technical infrastructure.  They connect the existing technologies, including applications, data streams, databases, data warehouses, data lakes, etc., without requiring any “rip-and-replace.”  A good implementation approach is to define well-scoped projects that can provide measurable business value in the short term (a few weeks or months), expose and connect data that is ripe for reuse in future projects, and work incrementally, avoiding multi-year big-bang implementations.  For those of us who have been around for a while, this is exactly how we approached service-oriented architecture initiatives in the late nineties and early 2000s.

There are many ways to implement a data fabric.  One way is to implement and integrate many different data management point solutions: for example, relational and non-relational database management, integration, a caching layer, a data catalog, workflow, business intelligence, machine learning, and metadata and semantic data management.  We’ve seen that organizations that try this approach usually end up with a complex architecture that is slow to deploy, difficult to maintain, lacks performance, and is inefficient in its use of infrastructure resources.  Instead, a recommended approach is to look for data platform technology that provides much of the required functionality in a single product or platform.  One of our customers, a $5B fintech software provider, has been able to replace eight different technologies with a single product, gaining nine times better performance running on only 30% of the infrastructure, and with a far simpler architecture.

Q5. Let’s talk about supply chains. Supply chains generate vast amounts of data from various sources. Can a data fabric model across the IT ecosystem be used also to solve supply chain challenges? 

Absolutely! Supply chains are a perfect domain for data fabrics because they are large, disparate, and complex, spanning many different organizations, each with its own dissimilar data and application stacks.  Organizations require real-time visibility across the end-to-end continuum, from supply through distribution, to easily understand the status of millions or potentially billions of components and react to unexpected issues and disruptions as they occur.

Q6. What about disruptions to supply chain operations? 

Handling disruptions quickly and efficiently is the top issue in supply chain operations.  Disruptions are a constant occurrence and one of the most challenging supply chain issues that organizations must deal with: geopolitical events, labor shortages, supply failures, weather patterns, and rapidly changing consumer demand can all impact supply and demand.  An intelligent control tower must provide not only real-time end-to-end visibility but also predictive insights into the likelihood of disruptions; it must calculate the impact on the business and present a set of data-driven prescriptive options for preventing potential disruptions in advance, or handling them in real time when they occur.  Organizations can accelerate data-driven decision-making by leveraging a data fabric with embedded analytics to achieve a higher level of decision support and automation-driven outcomes.
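The predict-quantify-prescribe loop described above can be sketched in a few lines. This is an illustrative toy, with made-up numbers, option names, and function names rather than any real control-tower product: predict a disruption's likelihood, quantify its expected business impact, and rank mitigation options by total expected cost.

```python
# Toy sketch of the "predict -> quantify -> prescribe" loop.
# All figures and option names are illustrative assumptions.

def expected_impact(prob_disruption: float, cost_if_hit: float) -> float:
    """Expected business impact of a predicted disruption, in dollars."""
    return prob_disruption * cost_if_hit


def prescribe(options):
    """Rank mitigation options by residual expected impact plus option cost."""
    return sorted(options, key=lambda o: o["residual_impact"] + o["cost"])


# A predicted port closure: 30% likely, $1M of lost revenue if it hits.
baseline = expected_impact(0.30, 1_000_000)  # $300,000 expected impact

options = [
    {"name": "do nothing",         "cost": 0,       "residual_impact": baseline},
    {"name": "expedite freight",   "cost": 40_000,  "residual_impact": expected_impact(0.05, 1_000_000)},
    {"name": "second-source part", "cost": 120_000, "residual_impact": expected_impact(0.01, 1_000_000)},
]

best = prescribe(options)[0]["name"]  # "expedite freight": $90k total vs $300k / $130k
```

In a real smart data fabric the probabilities would come from embedded ML models over the harmonized data rather than hard-coded constants, but the decision logic (quantify, then rank prescriptive options) has this shape.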

Q7. Is a data fabric strategy a path to a digital supply chain transformation? If yes, what does it mean in practice? 

Yes, very much so.  Most organizations are moving to an “analytics and decision intelligence” data platform strategy to meet their digital transformation goals in the supply chain.  Doing so requires a modern architecture that can harmonize and normalize data from any disparate data source in real time, simulate business processes, and provide AI and ML capabilities that enable dynamic, optimized decision-making at the line-of-business level.  In practice, there are industry-standard digital maturity models that can provide guidance.  The progression starts with understanding customer needs and critical KPIs, then leverages a foundational data fabric architecture and develops processes to progress incrementally to the higher levels of digital maturity: a predictive, autonomous, and adaptive supply chain.

Q8. Do you have any examples to share with us? 

Of course, we have many examples of customers that are leveraging a smart data fabric in supply chain to achieve outstanding results.  One of our customers is the largest wholesaler of drugs and cosmetics in Japan.  They distribute 50,000 different products from 1,000 different manufacturers to 400 different retailers that operate more than 50,000 stores.  That’s a total of 3.5 billion products every year!  Using this approach, they’re achieving 99.999% On Time In Full (OTIF) delivery accuracy, compared with a 65% industry average.  That means that for every 100,000 products they deliver, 99,999 arrive at the customer both on time and in full.  That’s an incredible achievement.
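The OTIF arithmetic quoted above is straightforward to check. Here is a minimal sketch with made-up delivery records (not the customer's data), showing that one miss in 100,000 deliveries corresponds to exactly 99.999%:

```python
# On Time In Full (OTIF): the fraction of deliveries that arrive
# both on time AND complete. Illustrative data only.

def otif_rate(deliveries):
    """deliveries: list of (on_time: bool, in_full: bool) pairs."""
    hits = sum(1 for on_time, in_full in deliveries if on_time and in_full)
    return hits / len(deliveries)

# 100,000 deliveries with a single partial shipment -> the 99.999% cited above
records = [(True, True)] * 99_999 + [(True, False)]
print(f"{otif_rate(records):.3%}")  # 99.999%
```

Note that a delivery that is on time but incomplete (or complete but late) counts as a miss, which is what makes a 99.999% OTIF rate so much harder to reach than either metric alone.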

Q9. Let’s now look at data fabric applicability in manufacturing industries. What are the benefits of using data fabric for the manufacturing industries? 

Industry 4.0 is all about digitizing the manufacturing environment and enabling OT/IT convergence to streamline the entire process chain and improve efficiency and responsiveness.  And it’s not just about creating digital twins for the factory.  A data fabric can span supply through manufacturing, assembly, and distribution, including SCP, MRP, MES, ERP, CRM, PLM, inventory management, and more, to provide true end-to-end visibility.  And just as with supply chains, a smart data fabric with advanced analytics capabilities embedded within the fabric can provide predictive and prescriptive analytics: for example, to inform predictive maintenance that keeps critical production lines running, to balance supply with predicted fluctuations in demand, and to optimize staffing.

Qx. Anything else you wish to tell us? 

Many industry analysts are promoting the data fabric architecture as the preferred approach for many use cases, especially where there is a lot of disparate and dissimilar data to be managed.  However, it can be overwhelming to get started.  We recommend that the technical teams in an organization work closely with stakeholders in the lines of business to identify the use cases that can bring the most value to the organization, and implement in sprints that each deliver some measurable business value.  We also recommend working with a trusted partner that has proven experience with similar organizations and use cases to help with strategy, best practices, and implementation.


Joe Lichtenberg, Director of Product and Industry Marketing, InterSystems.

Joe Lichtenberg is responsible for product and industry marketing for data platform software at InterSystems. Joe has decades of experience working with various data management, analytics, and cloud computing technology providers.


Sponsored by InterSystems.
