
A-Team Insight Blogs

Data Fabrics Bring Speed and Agility. Just Make Sure Those Seams are Secure

The increasing complexity and volume of data used by enterprises has prompted a rethink among chief data officers about how best to manage it. The need for more individuals within an organisation to access information that was once the preserve of a handful of managers, and the switch in emphasis from systems-derived to externally sourced data, have placed pressure on traditional structures such as data lakes and warehouses.

In recent years, attention has turned to data fabric and data mesh architectures. With a focus on connectivity and outcomes, these structures envisage a world in which data resides with or near the processes that need it and where the data isn’t moved, merely accessed between connected endpoints.
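
To make the “access, don’t move” idea concrete, below is a minimal sketch in Python. The Endpoint class and its query method are hypothetical stand-ins for a real federation engine; the point is only that queries travel to the data and only results travel back.

    # Minimal sketch of the "access, don't move" idea behind a data fabric.
    # The endpoints and their in-memory rows are hypothetical stand-ins for
    # real systems (a warehouse, a CRM, a market-data feed).

    class Endpoint:
        """A data source that answers queries locally instead of shipping raw data."""
        def __init__(self, name, rows):
            self.name = name
            self._rows = rows  # the data stays here; callers never copy the full set

        def query(self, predicate):
            # The predicate travels to the data; only matching rows travel back.
            return [row for row in self._rows if predicate(row)]

    # Three independent systems, each keeping its data where it already lives.
    endpoints = [
        Endpoint("warehouse", [{"id": 1, "region": "EU"}, {"id": 2, "region": "US"}]),
        Endpoint("crm", [{"id": 3, "region": "EU"}]),
        Endpoint("feeds", [{"id": 4, "region": "APAC"}]),
    ]

    # A fabric-style query: fan the predicate out, stitch only the answers together.
    eu_rows = [row for ep in endpoints for row in ep.query(lambda r: r["region"] == "EU")]
    print(eu_rows)  # [{'id': 1, 'region': 'EU'}, {'id': 3, 'region': 'EU'}]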

Proponents argue that a data fabric can save an enterprise time and money while accelerating processing times. But observers warn it doesn’t come without risks; like clothing fabric, data fabrics have weak points where different structures are joined. Identifying and managing those seams will be essential to ensuring the total fabric’s integrity.

Peter Jackson, Chief Data and Analytics Officer at Exasol, says data fabrics and meshes have emerged as a natural evolution in data management as traditional structures have proven ill-equipped to deal with the sudden surge of information that digitalisation has generated.

“For the past 10 years, everybody has been building data lakes, data warehouses, data swamps – call them what you will – driven by the concept of big data,” Jackson tells Data Management Insight. “Well, data mesh is the reaction to that, where data warehouses haven’t worked, or haven’t worked as well as people thought.”

The terms data fabric and data mesh are often used interchangeably. Jackson argues they are almost identical – interconnected, endpoint-focused and platform- and environment-agnostic systems. Scott Gnau, Head of Data Platforms at InterSystems, agrees, adding that the differences are tied into distribution patterns.

“They are different flavours of the same thing,” Gnau says. “The notion behind a data fabric is the connectivity but it still implies that there’s some notion of a centralised processing of the data. Data mesh is all about distributing and processing as close to the data as possible.

“I would argue that both of those things are very relevant and required as part of these new architectures, and so I would imagine that our industry will come up with some other term to describe that.”
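
One hedged way to picture that distinction in code is the sketch below, in Python, with plain in-memory lists standing in for real systems. It computes the same aggregate two ways; everything here is illustrative rather than any vendor’s API.

    # Contrasting the two flavours, with plain lists standing in for sources.
    sources = {
        "trades": [{"notional": 100}, {"notional": 2500}, {"notional": 40}],
        "positions": [{"notional": 900}, {"notional": 12}],
    }

    def fabric_style_total(threshold):
        # Fabric flavour: connectivity plus a central processing step.
        # Rows are gathered to one place, then filtered and summed there.
        gathered = [row for rows in sources.values() for row in rows]
        return sum(r["notional"] for r in gathered if r["notional"] > threshold)

    def mesh_style_total(threshold):
        # Mesh flavour: push the filter and sum to each source, then
        # combine only the small per-source results.
        partials = (
            sum(r["notional"] for r in rows if r["notional"] > threshold)
            for rows in sources.values()
        )
        return sum(partials)

    # Same answer either way; what differs is where the processing happens.
    assert fabric_style_total(50) == mesh_style_total(50) == 3500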

At the heart of fabrics and meshes is the need to make data more accessible, available, discoverable, secure, and interoperable at a time when enterprises are receiving tsunamis of information from internal and external sources. Gnau estimates that in a short space of time companies have seen their data profiles shift from solely enterprise-sourced to 80 per cent obtained from external processes, including internet-connected devices and customer engagement platforms.

That shift argues for a faster ability to query data where it resides, which can translate into faster time to value without the need to move the data.

“A data mesh is an emerging concept that is built to allow for scalable, consistent data access across the organisation,” says Talend’s Head of Field Solutions Strategy. “The data itself is the driving factor here and an operational view of data is built out and managed by domain experts. This speaks to the concept of ‘data as a product’, where the team will be responsible for that data, but access is decentralised.

“This is a cultural change for many organisations. The data is curated by that team, and they take much greater responsibility for that data, the quality of that data, and how it is accessed.”
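
One way to picture ‘data as a product’ is as a contract that the owning domain team publishes alongside its data. The sketch below is a hypothetical illustration in Python; the field and check names are assumptions, not a standard.

    # Illustrative sketch of a "data product": the domain team that owns the
    # data also publishes its schema, quality rules and ownership, while
    # access stays decentralised. Field names here are hypothetical.

    from dataclasses import dataclass, field
    from typing import Callable, List

    @dataclass
    class DataProduct:
        name: str
        owner_team: str  # the domain team accountable for this data
        schema: dict     # column -> type, as documented by the owning team
        quality_checks: List[Callable[[dict], bool]] = field(default_factory=list)

        def validate(self, row: dict) -> bool:
            """Run the owning team's quality rules before the row is served."""
            return all(check(row) for check in self.quality_checks)

    trades = DataProduct(
        name="settled-trades",
        owner_team="post-trade",
        schema={"trade_id": "str", "notional": "float"},
        quality_checks=[
            lambda r: set(r) == {"trade_id", "notional"},  # schema conformance
            lambda r: r["notional"] >= 0,                  # no negative notionals
        ],
    )

    print(trades.validate({"trade_id": "T1", "notional": 5000.0}))  # True
    print(trades.validate({"trade_id": "T2", "notional": -1.0}))    # False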

Capital Market Benefits

Although fabrics and meshes can be applied to any sort of organisation, their emphasis on federated processing, speed and access makes them particularly well suited to capital markets companies.

Kieran Seaward, Head of Sales at Datactics, says that the size and complexity of banks, asset management firms and other financial services providers mean they can benefit from the agility that fabrics and meshes offer. The conceptual nature of the approach also benefits them in terms of costs because the transition doesn’t require huge new infrastructure investment.

“Capital markets organisations often have massive amounts of disparate data sources, data lakes, data warehouses and business systems, and are often very siloed,” he tells Data Management Insight. “So this type of approach can help capital markets organisations have better governance and centralised control of their data management, without having to rip up those existing architectures.

“And that’s the important bit because you don’t need to change the storage of that information – it’s going to remain in place, but you’re going to have this virtual layer over the top that makes it easier for you to effect change around governance, quality and lineage of data without having to rip up and replace existing architectures.”
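
That “virtual layer over the top” might be sketched as a thin catalogue that maps logical names onto data where it already sits, recording lineage on every access. Again, all class and method names below are hypothetical, not any vendor’s API.

    # A minimal sketch of the "virtual layer": a catalogue maps logical names
    # to data that stays exactly where it is, and records lineage for every
    # access. All names here are hypothetical.

    class VirtualLayer:
        def __init__(self):
            self._catalogue = {}  # logical name -> (physical location, fetcher)
            self.lineage = []     # audit trail of who read what, and from where

        def register(self, logical_name, location, fetcher):
            # Existing storage is untouched; we only record how to reach it.
            self._catalogue[logical_name] = (location, fetcher)

        def read(self, logical_name, requested_by):
            location, fetcher = self._catalogue[logical_name]
            self.lineage.append((requested_by, logical_name, location))
            return fetcher()

    layer = VirtualLayer()
    layer.register("ref/counterparties", "legacy-warehouse.db",
                   fetcher=lambda: [{"id": "CP1"}, {"id": "CP2"}])

    rows = layer.read("ref/counterparties", requested_by="risk-team")
    print(rows)           # data served from where it already lives
    print(layer.lineage)  # [('risk-team', 'ref/counterparties', 'legacy-warehouse.db')]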

Exasol’s Jackson cautions that, as the latest iterations in the evolution of data management, fabrics and meshes won’t stay the same for long. The self-described “data evangelist” sees a future in which the endpoint users and custodians of data will eventually find themselves sitting on their own data lakes. Enterprises could, in the end, be stuck with the same problem that necessitated the emergence of distributed structures.

But Gnau warns of more immediate challenges, which were highlighted during the pandemic. Work-from-home orders forced companies to create ad-hoc distributed architectures. While they worked, Gnau argues, they concealed inherent weaknesses that will surface as companies, post-pandemic, seek to knit those structures together.

“In many instances, because of the time-based pressure to maintain continuity of business and customer relationships, many of those structures were built independently – and they don’t interact with each other,” he says. “So how do you start to unite these things? When you try to stitch two cloths together you get a seam, a point of weakness.

“The idea behind a data fabric is it’s somewhat seamless and provides better supportability and sustainability through whatever the next shockwave is. So how you remove those seams to make sure that this thing will work and support your business in a sustainable and predictable fashion becomes a really big deal.”
