A-Team Insight Blogs

Integrated Data Access and the Shift to Data-as-a-Service Models

The key to operational efficiency and improved decision-making in financial services

By Martijn Groot, Vice President, Marketing and Strategy, Alveo.

The vast proliferation of data sources available to financial services firms is both a challenge and an opportunity. The increased volume and diversity of data sets can give firms colour and additional insight into markets, companies, customers and financial products, helping them improve their products and services and increase competitiveness.

However, to fully realise the benefits of this data deluge, financial services firms need to be more proactive in how they harness data to drive decision-making. A lot of data quietly accumulates simply as a by-product of everyday business activities and, if left unmanaged, makes it difficult for firms to derive any meaningful insights. Similarly, a lack of consensus on, or understanding of, what the firm is trying to achieve with the data it holds ultimately prevents that data from being translated into the big-picture view that management needs for decision-making.

As a result, financial services firms can no longer afford to take a conservative or reactive approach to data collection and data curation. To remain competitive in an evolving landscape and keep seeing the wood for the trees, firms must up their game when it comes to speeding up the cycle of data collection and preparation, onboarding new sources of information, and leveraging data scientists to gather the real-time insights needed to expedite decision-making. This will shorten change cycles and help firms take advantage of new market opportunities.

Rethinking data collection and data preparation processes

New data sets can be provided raw, but they are often packaged in different ways, with access typically via APIs or traditional, file-based delivery. If the content is raw, businesses can use methods such as Natural Language Processing (NLP) to extract content directly from text-based data. A combination of instrument or entity identifiers and keywords can then be used to analyse the textual data and identify and extract the relevant content. The curation, or quality control, of data then requires the integration of multiple data sets from different sources to create a composite picture.
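
By way of illustration, the sketch below shows how a simple filter over raw text might combine instrument identifiers and keywords to surface relevant items. The identifiers, keywords and sample items are hypothetical, and a production set-up would typically add fuller NLP steps such as entity resolution.

```python
import re

# Illustrative watch lists: instrument identifiers (here, ISINs) and topic
# keywords a firm might care about. These values are purely hypothetical.
WATCHED_IDENTIFIERS = {"US0378331005", "DE0007164600"}
KEYWORDS = {"downgrade", "merger", "default", "profit warning"}

# Basic ISIN shape: two letters, nine alphanumerics, one check digit.
ISIN_PATTERN = re.compile(r"\b[A-Z]{2}[A-Z0-9]{9}\d\b")

def extract_relevant(news_items):
    """Return the subset of raw text items that mention a watched
    identifier and at least one keyword of interest."""
    relevant = []
    for item in news_items:
        isins = set(ISIN_PATTERN.findall(item)) & WATCHED_IDENTIFIERS
        hits = {kw for kw in KEYWORDS if kw in item.lower()}
        if isins and hits:
            relevant.append({"text": item, "identifiers": isins, "keywords": hits})
    return relevant

sample = [
    "Rating agency issues downgrade warning on bond US0378331005.",
    "Quarterly results in line with expectations.",
]
print(extract_relevant(sample))  # only the first item is returned
```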

Traditionally, data preparation processes have often been complex, costly and cumbersome. Data acquisition was typically shaped by monthly or quarterly reporting cycles, leading to insights that were outdated, inaccurate or incomplete. Resolving the conundrum of drawn-out data management processes that often provide stale or inaccurate input to decision-making is required to keep pace both with external pressures, such as investor or regulatory demand for information, and with internal demands for faster decision-making. Ultimately, the inability to drive big-picture decision-making will put financial services organisations further on the back foot when it comes to keeping up with more nimble fintechs – a sector that has recently seen investment increase sevenfold.

Data collection, aggregation, curation and analysis should be a continuous process. To steer this function efficiently, firms must first decide on their objectives. These could be the regular supply of data sets for BAU operations, or self-service data collection and analysis for data scientists. Clearly defined goals are essential, and they start with the basics: measuring data quality aspects such as accuracy, completeness and timeliness; providing easy access to content via data catalogues, browse capabilities and APIs; and measuring data consumption and internal data distribution.
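
As a rough illustration of those basics, the sketch below computes two of the quality aspects mentioned, completeness and timeliness, over a small batch of records. The field names, freshness window and sample values are assumptions made for the example.

```python
from datetime import datetime, timezone

# Hypothetical batch of pricing records; field names are illustrative only.
records = [
    {"isin": "US0378331005", "price": 189.30, "as_of": "2024-05-01T17:30:00+00:00"},
    {"isin": "DE0007164600", "price": None,   "as_of": "2024-05-01T17:30:00+00:00"},
    {"isin": "GB0002634946", "price": 101.25, "as_of": "2024-04-28T17:30:00+00:00"},
]

def completeness(rows, field):
    """Share of records with a populated value for the given field."""
    populated = sum(1 for r in rows if r.get(field) is not None)
    return populated / len(rows)

def timeliness(rows, max_age_hours, now):
    """Share of records whose timestamp falls within the freshness window."""
    fresh = sum(
        1 for r in rows
        if (now - datetime.fromisoformat(r["as_of"])).total_seconds() <= max_age_hours * 3600
    )
    return fresh / len(rows)

as_of_date = datetime(2024, 5, 2, tzinfo=timezone.utc)
print(f"price completeness: {completeness(records, 'price'):.0%}")        # 67%
print(f"timeliness (<=48h): {timeliness(records, 48, as_of_date):.0%}")   # 67%
```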

The next step is to put data scientists to work, getting their input on gathering the right data and ‘asking the right questions’. What data best suits a firm will depend on the markets, clients and geographies the organisation currently has working relationships with. Lists of interest will then specify what needs to be collected. There will also be metadata requirements: SLAs that specify service windows, turnaround times and quality metrics must be considered. The right technology will shrink the cycle time required for data preparation and curation, making data harvesting, data set combination and live insight a near real-time process.

A skilled data analyst can do all this and translate the data into the big picture, helping to drive decision-making.
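
To make the metadata requirements concrete, the sketch below captures an SLA for a single feed as a simple data structure. The fields and thresholds are illustrative assumptions rather than a standard schema.

```python
from dataclasses import dataclass

@dataclass
class DataFeedSLA:
    """Illustrative metadata record for a sourced data set; the field
    names and thresholds are assumptions, not a standard schema."""
    dataset: str
    service_window: str        # delivery expected within this window (UTC)
    turnaround_minutes: int    # max time from receipt to downstream availability
    min_completeness: float    # required share of populated mandatory fields
    max_staleness_hours: int   # data older than this counts against timeliness

eod_prices_sla = DataFeedSLA(
    dataset="eod_equity_prices",
    service_window="18:00-20:00 UTC",
    turnaround_minutes=45,
    min_completeness=0.995,
    max_staleness_hours=24,
)
```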

Catering to new business requirements and shorter change cycles

Recent years have seen significant changes in data management and analytics processes within financial services. Most, if not all, business processes have become much more data intensive. Concurrently, to speed up decision-making and keep on top of data volumes, derived data and analytics have become much more pervasive within workflows.

Previously, data management and analytics were largely disparate disciplines. Data management covered activities such as data sourcing, cross-referencing and ironing out discrepancies via reconciliations and data cleansing. Analytics typically followed as a separate step, undertaken in desk-level tools and libraries that sit close to users and operate on separate data sets. This divide has created problems for many financial institutions, with slower time-to-access impacting decision-making processes. Newer set-ups blend these two functions, but this places new requirements on data management processes.

The emergence of Data-as-a-Service as the preferred sourcing model

With the data management function covering more data sources and extending into analytics, self-service capabilities provide staff across departments with easy access to the data required for decision-making.

Therefore, data management solution providers are shifting their service offerings from software to a Data-as-a-Service (DaaS) model, under which suppliers host and run the data management infrastructure while also performing checks on the data and doing part of the curation work. Dashboards give the end client full visibility into these processes.
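
A minimal sketch of what consuming such a service might look like from the client side appears below. The endpoint, parameters and authentication scheme are hypothetical and do not describe any particular provider’s API.

```python
import requests

# Hypothetical DaaS endpoint and parameters: actual URLs, authentication and
# field names vary by provider; nothing here reflects a specific vendor API.
BASE_URL = "https://daas.example.com/api/v1"
API_KEY = "provider-issued-key"  # placeholder credential

def fetch_curated_prices(isins, as_of):
    """Pull already-cleansed pricing records from the provider, rather than
    sourcing and reconciling raw feeds in-house."""
    response = requests.get(
        f"{BASE_URL}/prices",
        params={"isin": ",".join(isins), "as_of": as_of},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

prices = fetch_curated_prices(["US0378331005"], as_of="2024-05-01")
```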

DaaS can cover any data set, including corporate actions, security master, issuer and pricing data, and can range from use-case-specific offerings to data services at an enterprise level. Data collection and curation typically include the tracking of quality metrics, the cross-referencing and verification of different sources, the set-up of business rules for data quality and data derivation, and root-cause analysis of gaps and errors. Cleansed and fully prepared data then feeds into a customer’s operations, risk management, performance management and compliance.
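
The sketch below illustrates one such cross-referencing rule: comparing prices for the same instrument from two sources and flagging breaks beyond a tolerance as candidates for exception handling and root-cause analysis. The vendor data, tolerance and field names are assumptions for the example.

```python
# Illustrative cross-source verification rule; values are made up.
TOLERANCE = 0.005  # 0.5% relative difference

vendor_a = {"US0378331005": 189.30, "DE0007164600": 161.10}
vendor_b = {"US0378331005": 189.25, "DE0007164600": 158.80}

def cross_check(primary, secondary, tolerance=TOLERANCE):
    """Return instruments whose prices disagree by more than the tolerance,
    or are missing in the secondary source."""
    breaks = {}
    for isin, price in primary.items():
        other = secondary.get(isin)
        if other is None:
            breaks[isin] = "missing in secondary source"
        elif abs(price - other) / other > tolerance:
            breaks[isin] = f"{price} vs {other}"
    return breaks

print(cross_check(vendor_a, vendor_b))
# {'DE0007164600': '161.1 vs 158.8'}
```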

On top of the curated data sets provided by the DaaS provider, customers’ quants and data analysts can then set up proprietary metrics to feed into decision-making.

DaaS solutions are a logical next step after managed services, which cover hosting and application management but typically do not include data curation. Firms see the value in having a one-stop shop and benefitting from the domain expertise of specialist providers that integrate data sources for a living. Firms stand to benefit from shortened change cycles, improved data quality, and transparency into the collection and verification processes. Combined with quality metrics on diverse data sets and sources, businesses will find their data operations vastly improved.
