About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Integrated Data Access and the Shift to Data-as-a-Service Models


The key to operational efficiency and improved decision-making in financial services

By Martijn Groot, Vice President, Marketing and Strategy, Alveo.

The vast proliferation of data sources available to financial services firms is both a challenge and an opportunity. Increased volumes, as well as diversity, of data sets can provide firms with colour and additional insight on markets, firms, customers and financial products, which can help them improve their products and services and increase competitiveness.

However, to fully realise the benefits of this data deluge, financial services firms need to be more proactive in how they harness data to drive decision-making. A lot of data quietly accumulates simply as a by-product of everyday business activities, which if left unmanaged, makes it difficult for firms to derive any meaningful insights. Similarly, a lack of consensus on, or understanding of, what the firm is trying to achieve with the data it holds ultimately prevents it from being translated into the big-picture view that management needs for decision-making.

As a result, financial services firms can no longer afford to take a conservative or reactive approach to data collection and data curation. To remain competitive in an evolving landscape and keep seeing the wood for the trees, firms must up their game when it comes to speeding up the cycle of data collection and preparation, onboarding new sources of information, and leveraging data scientists to gather the real-time insights needed to expedite decision-making. This will shorten change cycles and help firms take advantage of new market opportunities.

Rethinking data collection and data preparation processes

New data sets can be provided raw, but they are often packaged in different ways, with access typically via APIs or traditional file-based delivery. If the content is raw, businesses can use methods such as Natural Language Processing (NLP) to extract content directly from text-based data. The textual data can then be analysed using a combination of instrument or entity identifiers and keywords to identify and extract the relevant content. Curation, or quality control, of the data then requires the integration of multiple data sets from different sources to create a composite picture.
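As a minimal sketch of this identifier-plus-keyword screening step, the snippet below scans free text for sentences that mention both an ISIN-like identifier and a watchlist keyword. The pattern, keywords and sample text are illustrative assumptions, not part of any vendor's actual pipeline.

```python
import re

# Illustrative watchlist: an ISIN-shaped pattern plus keywords of interest.
# Both are assumptions for the example, not a real screening rule set.
ISIN_PATTERN = re.compile(r"\b[A-Z]{2}[A-Z0-9]{9}[0-9]\b")
KEYWORDS = {"downgrade", "default", "merger"}

def extract_relevant(text: str) -> list[str]:
    """Return sentences containing both an identifier and a keyword."""
    sentences = re.split(r"(?<=[.!?])\s+", text)
    hits = []
    for sentence in sentences:
        has_id = bool(ISIN_PATTERN.search(sentence))
        has_kw = any(kw in sentence.lower() for kw in KEYWORDS)
        if has_id and has_kw:
            hits.append(sentence.strip())
    return hits

sample = ("Ratings agency flags possible downgrade of bond US0378331005. "
          "Weather was fine in London.")
print(extract_relevant(sample))
# ['Ratings agency flags possible downgrade of bond US0378331005.']
```

In practice an NLP library would replace the regex split and keyword match, but the shape of the step is the same: filter raw text down to the fragments relevant to entities the firm cares about.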

Traditionally, data preparation processes have been complex, costly and cumbersome. Data acquisition was typically shaped by monthly or quarterly reporting cycles, leading to insights that were outdated, inaccurate or incomplete. Resolving this conundrum of drawn-out data management processes that feed stale or inaccurate input into decision-making is essential to keep pace with external pressures, such as investor and regulatory demand for information, as well as internal demands for faster decision-making. Ultimately, an inability to drive big-picture decision-making will put financial services organisations further on the back foot when it comes to keeping up with more nimble fintechs – a sector that has recently seen investment increase sevenfold.

Data collection, aggregation, curation and analysis should be a continuous process. To steer this function efficiently, firms must first decide their objectives. This could be the regular supply of data sets for BAU operations, or self-service data collection and analysis for data scientists. Clearly defined goals are essential, and they start with the basics: measuring data quality aspects including accuracy, completeness and timeliness; providing easy access to content via data catalogues, browse capabilities and APIs; and measuring data consumption and internal data distribution.
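Two of the quality dimensions mentioned above, completeness and timeliness, can be scored mechanically per record. The sketch below assumes a simple price record with illustrative field names and thresholds; it is an example of the kind of metric, not a standard definition.

```python
from datetime import datetime, timedelta

# Assumed schema for the example: these field names are illustrative.
REQUIRED_FIELDS = ("instrument_id", "price", "currency", "as_of")

def completeness(record: dict) -> float:
    """Fraction of required fields that are present and non-empty."""
    filled = sum(1 for f in REQUIRED_FIELDS if record.get(f) not in (None, ""))
    return filled / len(REQUIRED_FIELDS)

def is_timely(record: dict, now: datetime, max_age: timedelta) -> bool:
    """True if the record's as-of timestamp falls within the staleness window."""
    as_of = record.get("as_of")
    return as_of is not None and (now - as_of) <= max_age

now = datetime(2024, 1, 2, 9, 0)
record = {"instrument_id": "XS123", "price": 101.5, "currency": "EUR",
          "as_of": datetime(2024, 1, 2, 8, 30)}
print(completeness(record))                        # 1.0
print(is_timely(record, now, timedelta(hours=1)))  # True
```

Accuracy, by contrast, usually requires comparison against a second source, which is where the cross-referencing discussed later comes in.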

The next step is to put data scientists to work, gathering their input on collecting the right data and ‘asking the right questions’. Which data best suits a firm will depend on the markets, clients and geographies the organisation currently has working relationships with. Lists of interest then specify what needs to be collected. There are also metadata requirements: SLAs that specify service windows, turnaround times and quality metrics must be considered. The right technology will shrink the cycle time required for data preparation and curation, making data harvesting, data set combination and live insights a near real-time process.
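An SLA of the kind described, covering a service window, a turnaround time and a quality metric, can be captured in a small structure and checked per delivery. The names and thresholds below are hypothetical, not any provider's actual terms.

```python
from dataclasses import dataclass
from datetime import time

# Hypothetical SLA record for a data feed; all values are illustrative.
@dataclass
class FeedSLA:
    name: str
    window_start: time        # daily service window
    window_end: time
    max_turnaround_min: int   # minutes from request to delivery
    min_completeness: float   # required quality metric

    def breached(self, delivered_at: time, turnaround_min: int,
                 completeness: float) -> list[str]:
        """Return the list of SLA terms this delivery violates."""
        issues = []
        if not (self.window_start <= delivered_at <= self.window_end):
            issues.append("outside service window")
        if turnaround_min > self.max_turnaround_min:
            issues.append("turnaround exceeded")
        if completeness < self.min_completeness:
            issues.append("completeness below threshold")
        return issues

sla = FeedSLA("eod_prices", time(17, 0), time(19, 0), 30, 0.99)
print(sla.breached(time(18, 15), 45, 0.995))  # ['turnaround exceeded']
```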

A skilled data analyst can do all this and translate the data into the big picture, helping to drive decision-making.

Catering to new business requirements and shorter change cycles

Recent years have seen significant changes in data management and analytics processes within financial services. Most, if not all, business processes have become much more data intensive. Concurrently, to speed up decision-making and keep on top of data volumes, derived data and analytics have become much more pervasive within workflows.

Previously, data management and analytics were largely disparate disciplines. Data management involved activities such as data sourcing, cross-referencing and ironing out any discrepancies via reconciliations and data cleansing. Analytics typically took place afterwards, undertaken in various desk-level tools and libraries close to users and operating on a separate data set. This divide has created problems for many financial institutions, with slower time-to-access impacting decision-making processes. New set-ups have blended these two functions, but this has placed new requirements on data management processes.

The emergence of Data-as-a-Service as the preferred sourcing model

With the data management function covering more data sources and extending into analytics, self-service capabilities provide staff across departments with easy access to the data required for decision-making.

Therefore, data management solutions providers are shifting their service offering from software to a Data-as-a-Service (DaaS) model. This is where suppliers host and run the data management infrastructure while also performing checks on the data and doing part of the curation work. Dashboards provide the end client full visibility into these processes.

DaaS can cover any data set including corporate actions, security master, issuer and pricing data, and can range from use-case specific offers to providing data services at an enterprise level. Data collection and curation typically includes the tracking of quality metrics, the cross-referencing and verification of different sources, the set-up of business rules for data quality and data derivation, and root-cause analysis on gaps and errors. Cleansed and fully prepared data then feeds into a customer’s operations, risk management, performance management and compliance.
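One common curation rule of the kind described, cross-referencing sources and flagging gaps or conflicts for root-cause analysis, is to build a composite record by taking each field from the highest-priority source that supplies it. The sketch below is a simplified illustration; source names, priorities and fields are assumptions for the example.

```python
# Build a composite ("golden") record from several sources, recording any
# fields where sources disagree so they can be investigated.
# Priority order is an illustrative business rule, not a fixed standard.
SOURCE_PRIORITY = ["vendor_a", "vendor_b", "exchange_feed"]

def build_composite(records: dict) -> tuple[dict, list[str]]:
    """records maps source name -> field dict; returns (composite, discrepancies)."""
    fields = {f for rec in records.values() for f in rec}
    composite, discrepancies = {}, []
    for field in sorted(fields):
        values = {src: rec[field] for src, rec in records.items() if field in rec}
        for src in SOURCE_PRIORITY:
            if src in values:
                composite[field] = values[src]   # take highest-priority value
                break
        if len(set(values.values())) > 1:
            discrepancies.append(field)          # sources disagree: investigate
    return composite, discrepancies

records = {
    "vendor_a": {"price": 101.5, "currency": "EUR"},
    "vendor_b": {"price": 101.7, "currency": "EUR", "coupon": 4.25},
}
golden, issues = build_composite(records)
print(golden)  # {'coupon': 4.25, 'currency': 'EUR', 'price': 101.5}
print(issues)  # ['price']
```

Real curation engines add tolerance bands, per-field source rankings and audit trails on top of this basic waterfall logic.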

On top of the curated data sets provided by the DaaS provider, customers’ quants and data analysts can then set up proprietary metrics to feed into decision-making.

DaaS solutions are a logical next step after managed services, which cover hosting and application management but typically do not include data curation. Firms see the value in having a one-stop shop and benefitting from the domain expertise of specialist providers that integrate data sources for a living. Firms stand to benefit from shortened change cycles, improved data quality, and transparency into the collection and verification processes. Combined with quality metrics on diverse data sets and sources, businesses will find their data operations vastly improved.

