
How Financial Firms Can Close the Data Integration Gap

By Sourav Moitra, a director of technology at Sapient Global Markets

Driven by the need for faster access to data, along with consolidated and more consistent, transparent data views across multiple platforms, financial firms are realising they must change their data integration processes to gain efficiency and meet business needs.

The financial industry is evolving quickly. Firms can no longer afford to wait for end-of-day data or work with different datasets in silos. Buy-side and sell-side firms want to react quickly to business events, with easy, near real-time access to data that supports better business decisions.

Too often, firms allocate a significant amount of time and money to updating existing data integration platforms when the foundations themselves aren’t strong enough. One key challenge is the sheer number of systems that create data inconsistencies, including legacy systems from mergers or acquisitions.

Further, these platforms limit transparency because they are built around largely IT-driven processes. A data-centric approach grounded in data governance, combined with business-driven data processes, can help create a more collaborative and flexible platform as data requirements continue to develop.

Evolution of data integration

Historically, data moved from one system to another based on a particular need. However, as data requirements evolved, complexity increased. Additional processes were added, leaving most organisations with an unstructured and complicated data architecture.

So how did we arrive here? Many firms implemented the different layers of a data integration platform as extract, transform and load (ETL) jobs. But business requirements didn’t stop progressing: more were added around audit and traceability, monitoring and issue resolution, intraday reporting, historical reporting and so on. The more requirements added on top of ETL-based approaches, the more complex the integration process became.

Many responded with silo-based models, implementing solutions each time a new requirement came into effect. For instance, audit and traceability requirements led to new data governance procedures. Intraday data consumption requirements led to new services and messaging. Historical reporting requirements led to the creation of data warehouses.

Because each of these solutions was developed in response to a specific issue, the result is a current-state model with data governance retro-fitted for specific requirements, inconsistent technology, and inefficient or redundant processes such as issue resolution.

Closing the data integration gap

You can start improving your data integration strategy with an effectively governed, data-centric approach. The following steps can help close the gap:

  1. Form a canonical data model. Canonical models standardise data processes and create consistent attributes for data in motion and data at rest, reducing conflicts. The model is also source agnostic: it treats the data as a whole, rather than leaving multiple systems to interpret data in different formats. (A minimal sketch follows this list.)
  2. Combine business and technical data processes. Rather than creating specific solutions for specific requirements, form a data platform that lets you model business and technical processes and fuse them together. Enforce a system of record, data element standards and data quality rules.
  3. Implement a loosely coupled, event-driven architecture. Implement data acquisition, validation, usage and retention policies for efficient data access, so business events can be addressed in real time rather than waiting on IT. (See the second sketch after this list.)
  4. Develop a business-driven approach. Measure and monitor operational efficiency, capture metrics for data use in business processes, and centralise your operational and issue resolution strategy to ensure consistency and quality.
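
To make steps 1 and 2 concrete, the sketch below shows one possible shape of a canonical, source-agnostic security record in Python, with a handful of business-defined data quality rules applied as the record is integrated. The class, field names and rules (CanonicalSecurity, source_system, the ISIN and currency checks) are hypothetical illustrations, not a prescribed schema.

```python
from dataclasses import dataclass
from datetime import date
from decimal import Decimal
from typing import Optional


@dataclass(frozen=True)
class CanonicalSecurity:
    """One source-agnostic representation of a security.

    Every upstream system maps its own fields onto these attributes,
    so data in motion and data at rest share a single vocabulary.
    """
    isin: str
    issuer: str
    currency: str
    maturity: Optional[date]
    notional: Decimal
    source_system: str  # lineage: which system the record came from


def validate(security: CanonicalSecurity) -> list[str]:
    """Business-defined data quality rules, enforced at the point of integration."""
    issues = []
    if len(security.isin) != 12:
        issues.append("ISIN must be 12 characters")
    if security.currency not in {"USD", "EUR", "GBP", "JPY"}:
        issues.append(f"Unexpected currency code: {security.currency}")
    if security.notional < 0:
        issues.append("Notional cannot be negative")
    return issues


# A record arriving from a hypothetical legacy system is mapped onto the
# canonical model and validated before anything downstream sees it.
legacy_row = {"Isin": "US0378331005", "Issr": "Apple Inc", "Ccy": "USD",
              "Mat": None, "Amt": "1000000", "Sys": "LEGACY_EQ"}

record = CanonicalSecurity(
    isin=legacy_row["Isin"],
    issuer=legacy_row["Issr"],
    currency=legacy_row["Ccy"],
    maturity=legacy_row["Mat"],
    notional=Decimal(legacy_row["Amt"]),
    source_system=legacy_row["Sys"],
)

print(validate(record))  # [] -> the record meets the quality rules
```

Keeping the quality rules next to the canonical definition means they travel with the data model instead of being re-implemented in every ETL job.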
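
Step 3 can be pictured with an equally small sketch: a loosely coupled publish/subscribe flow in which consumers react to business events as they happen rather than polling batch outputs. The in-process bus and the security.updated event name are stand-ins for illustration; in a real platform this role would be played by messaging infrastructure such as a broker or streaming layer.

```python
from collections import defaultdict
from typing import Callable

# Minimal in-process event bus. Producers and consumers share only event
# names and canonical payloads, not direct dependencies on each other.
Handler = Callable[[dict], None]
_subscribers: dict[str, list[Handler]] = defaultdict(list)


def subscribe(event_type: str, handler: Handler) -> None:
    _subscribers[event_type].append(handler)


def publish(event_type: str, payload: dict) -> None:
    for handler in _subscribers[event_type]:
        handler(payload)


# Hypothetical consumers reacting independently to the same business event.
subscribe("security.updated", lambda e: print(f"risk view refreshed for {e['isin']}"))
subscribe("security.updated", lambda e: print(f"audit trail entry written for {e['isin']}"))

# The acquisition layer publishes as soon as a validated canonical record
# lands, so consumers see intraday changes instead of an end-of-day batch.
publish("security.updated", {"isin": "US0378331005", "source_system": "LEGACY_EQ"})
```

Because consumers subscribe to events rather than being called by name, new uses of the data can be added without touching the acquisition layer.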

Start small

Moving to this future state won’t be easy. However, you don’t have to accomplish everything on day one. Start small and keep evolving your platform. For example, take a dataset such as securities and move it onto the new model, making sure the governance aspect is implemented correctly first. Once that is complete, bring in trades, positions and other datasets one after another.

Steady integration can help your firm move to a future state that is agile, responsive, governed and business driven.
