About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Beyond Licence Fees: Key Considerations for Lowering TCO in Financial Data Management

By Neil Sandle, Chief Product Officer, Alveo.

New technologies can significantly reduce the total cost of ownership (TCO) of financial data management platforms, but assessing that cost means looking beyond the vendor licence fee to the full scope of TCO.

Understanding your data management TCO requires evaluating key factors with solution providers, including the technology stack’s quality, disaster recovery costs, change management requirements, the vendor’s strategy for self-service versus specialised support needs, and product distribution capabilities.

The choice of technology stack – whether for a cloud-deployed solution (AWS, GCP, Azure) or a vendor-managed service – significantly influences costs that are often reflected in service charges. One might wonder about the relevance of technology in a managed service context. However, a client-server architecture, for example, can lead to inefficiencies. Vendors might over-provision resources to accommodate peak loads, like morning back-office file loads, leading to underutilisation and unnecessary costs for the rest of the day. Whether managing the cloud yourself or relying on a vendor, this approach results in higher operational costs, with these excess expenses typically passed on to you.

With a client-server architecture, you cannot simply add hardware resources to better run a particular service. Instead, you must move the whole platform to a larger virtual server, which again can significantly increase infrastructure costs.

Microservices architecture

Microservices architecture allows scaling as needed, reducing cloud and compute costs and making solutions more cost-effective. Another advantage of microservices architecture is that it seamlessly integrates with in-house applications, offering flexibility. This means that institutions don’t need to replace complete in-house-built solutions with a vendor’s client-server monolithic platform. Instead, they can choose to enhance the functionality of an in-house application by using vendor components to solve a particular problem.

For example, suppose you need a component to acquire data. One may already be available from a vendor as a separate microservice, offered as a managed service with regular updates as data sources change their feeds. Equally, you can use a vendor’s microservices components to address requirements including distribution, cross-referencing, quality reporting and consumption monitoring, and benefit from best-in-class capabilities without a wholesale overhaul of your existing data management infrastructure. This way, you can augment your data management setup on a piecemeal basis to address specific pain points.

My point is that a stateless microservices architecture enables customers to use vendor-specific functionality within their own in-house-built applications, an approach that is more cost-effective, enables the use of best-in-class components, and delivers faster time to market and ROI.
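The pattern described above can be sketched in a few lines. This is an illustrative mock, not any vendor’s real API: the pipeline and in-house validation stage are existing code, and the vendor’s stateless cross-referencing component (stubbed here; in production it would be a call to the vendor’s microservice) drops in as one more stage.

```python
# Hypothetical sketch: wrapping a vendor microservice as one pluggable
# stage in an in-house data pipeline. Names and payload shapes are
# assumptions for illustration, not a real vendor API.
from typing import Callable

Record = dict

def in_house_validate(record: Record) -> Record:
    # Existing in-house logic stays untouched.
    if "isin" not in record:
        raise ValueError("missing ISIN")
    return record

def vendor_cross_reference(record: Record) -> Record:
    # In production this would be an HTTP call to the vendor's
    # stateless cross-referencing microservice; stubbed here.
    record["figi"] = f"BBG-{record['isin'][-6:]}"  # illustrative only
    return record

def run_pipeline(record: Record,
                 stages: list[Callable[[Record], Record]]) -> Record:
    for stage in stages:
        record = stage(record)
    return record

enriched = run_pipeline({"isin": "US0378331005"},
                        [in_house_validate, vendor_cross_reference])
print(enriched["figi"])
```

Because each stage is just a function over a record, the vendor component can be swapped or removed without touching the rest of the pipeline.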

Operational resilience

Disaster recovery, crucial for financial services, can be cost-intensive with client-server architectures requiring standby systems. This is not ideal for those looking for a lower TCO.

Microservices architectures enable customers to deploy a single instance; through well-defined DevOps deployments spanning multiple availability zones or even cloud providers, institutions can achieve a fault-tolerant solution at lower cost than the typical standby arrangement a client-server architecture requires.

Another generally overlooked factor is the use of open-source technology, which is an important part of any TCO decision. Open-source technology, specifically for data storage, can dramatically impact the TCO of financial data management platforms.

Open source solutions

Open source data storage offers cost savings and performance, although the level of support provided varies. Cloud providers such as GCP, AWS or Azure offer managed services to help clients run these technologies, for example Amazon Keyspaces (for Apache Cassandra) or Amazon RDS for PostgreSQL.

These services can significantly reduce TCO because you will not need to hire internal staff to run the technology yourself. Check that your data management vendor offers tier-two support for any underlying open-source technology, specifically around their product.
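In practice, pointing an application at such a managed open-source database is often no more than a connection string. A minimal sketch, assuming a made-up RDS for PostgreSQL endpoint and environment-variable credentials:

```python
# Sketch: building a PostgreSQL DSN for a managed instance
# (e.g. Amazon RDS for PostgreSQL). The host below is a fabricated
# example endpoint, not a real one.
import os

def rds_dsn(host: str, db: str, user_env: str = "DB_USER",
            pwd_env: str = "DB_PASSWORD", port: int = 5432) -> str:
    """Build a PostgreSQL DSN, reading credentials from the environment."""
    user = os.environ.get(user_env, "app")
    password = os.environ.get(pwd_env, "")
    auth = f"{user}:{password}" if password else user
    return f"postgresql://{auth}@{host}:{port}/{db}"

dsn = rds_dsn("refdata.example123.eu-west-1.rds.amazonaws.com", "securities")
print(dsn)
```

The DSN can then be handed to any standard PostgreSQL client library; the provisioning, patching and failover of the database itself stay with the cloud provider.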

Lowering the cost of change

Data management processes in financial organisations are always in flight, and the tide of change coming from the business cannot be held back. It is therefore important to consider that change processes can affect not only the business’s ability to deliver, but also its costs.

In most instances, change can be the largest component of TCO if you make the wrong vendor decision. To avoid this trap, ask any prospective data management solution provider the following questions:

  • What are the underlying technology and data storage methods, and how easy is it to update the schema in the underlying database?

Where possible, avoid a heavily normalised data structure, as it adds significant overhead when changing the schema. It also requires the vendor to be involved when extending custom relationships, and they will typically charge you for this.
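To illustrate the contrast: in a rigidly normalised store, each new attribute means a schema migration; in a document-style store (as with Cassandra maps or PostgreSQL JSONB columns), adding an attribute is just a data write. A minimal sketch, with illustrative field names:

```python
# Illustrative sketch: with a document-style record, adding a new
# attribute is a data change, not a schema migration -- no ALTER TABLE,
# no vendor change request.
import json

security = {"isin": "US0378331005", "name": "Apple Inc."}

# Business asks for a new attribute: just write the field.
security["esg_score"] = 71

stored = json.dumps(security)   # what would land in, e.g., a JSONB column
print(json.loads(stored)["esg_score"])
```

The trade-off is that validation moves from the database schema into the application or data model layer, which is exactly why central data-model control (the next question) matters.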

  • Do you manage the data model centrally and can you extend the data model without asking the vendor?

Not controlling your own data model could significantly impact your ability to deliver to the business in a timely manner.

  • Can the vendor extend the data model with new attributes without impacting your customisations?

Typically, you will need to add some fields of your own, such as internal IDs, taxonomies or other fields used for downstream processing. It is important to select a vendor that can deliver maintenance updates without impacting customisations; otherwise you will incur a high testing overhead within your organisation.
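One common way to achieve this separation, sketched below with assumed field names, is to keep customisations in their own namespace so a vendor data-model update can be merged without touching them:

```python
# Sketch: vendor-maintained fields live at the top level, customisations
# live under a separate "custom" namespace, so vendor updates merge
# cleanly. All field names and values are illustrative.
def apply_vendor_update(record: dict, vendor_update: dict) -> dict:
    """Merge vendor-maintained fields, leaving the custom namespace intact."""
    merged = {**record, **vendor_update}
    merged["custom"] = record.get("custom", {})  # customisations survive
    return merged

record = {"isin": "US0378331005", "sector": "Tech",
          "custom": {"internal_id": "EQ-0042"}}
update = {"sector": "Information Technology", "lei": "HWUPKR0MPOU8FGXBT394"}

merged = apply_vendor_update(record, update)
print(merged["custom"]["internal_id"], merged["sector"])
```

A vendor whose model updates follow a convention like this lets you take maintenance releases without re-testing every customisation.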

  • Does the vendor offer standard support for the integration of data sources and managed feeds whereby the delivery of regular updates is included out of the box?

A managed service for data feeds can significantly reduce your TCO. Market data vendors regularly change their feeds, and keeping up can be a full-time job. Putting that burden on the vendor reduces your ongoing TCO and ensures better accuracy, as the vendor specialises in this task.

  • Does the vendor offer an out-of-the-box data model for normalisation and golden copy creation?

An out-of-the-box model for golden copy creation can significantly reduce the implementation lead time and overall cost.

  • When extending a data model for the business, do data attribute changes automatically become available for distribution?

When selecting a solution, it is important to get the vendor to show you how easily data model changes pass through to data distribution. In my view, this should be automated: changes should become available automatically in technologies like Kafka, or be easy to extend through the user interface for bulk delivery files.
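What "automatic" means here can be sketched simply: if the outbound payload is derived from the record itself rather than a hard-coded field list, a new attribute reaches consumers with no change to the publishing code. This is an illustrative sketch; a real deployment would hand the payload to a Kafka producer (e.g. kafka-python's `KafkaProducer.send`), which is omitted so the example stays self-contained.

```python
# Sketch of schema-driven distribution: the payload is derived from the
# record, so newly added attributes flow to consumers automatically.
import json

def to_payload(record: dict) -> bytes:
    # No hard-coded field list: whatever is on the record is distributed.
    return json.dumps(record, sort_keys=True).encode("utf-8")

record = {"isin": "US0378331005", "price": 189.98}
record["esg_score"] = 71   # new attribute added for the business

payload = to_payload(record)
print(b"esg_score" in payload)
```

Contrast this with a fixed-schema exporter, where adding `esg_score` would mean a code change, a release and a round of testing before the attribute ever reached a downstream consumer.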

Aspects to include when selecting a data management solution

The trend we are seeing in buy-side and sell-side organisations is a rise in adoption of technologies such as Kafka to distribute market data. Adopting technologies like this enables customers to focus on adopting a self-serve model. Enabling consumers of data to request their universe of interest and attributes revolutionises an organisation’s ability to adapt to the needs of its internal customers.
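The self-serve request described above amounts to a consumer declaring a universe of interest plus a set of attributes, and receiving only that slice. A minimal sketch, with fabricated data and field names:

```python
# Sketch: a self-serve data request -- filter a dataset down to a
# requested universe (by ISIN) and attribute set. Data is illustrative.
def self_serve(dataset: list[dict], universe: set[str],
               attributes: set[str]) -> list[dict]:
    return [{k: v for k, v in row.items() if k == "isin" or k in attributes}
            for row in dataset if row["isin"] in universe]

dataset = [
    {"isin": "US0378331005", "price": 189.98, "sector": "Tech"},
    {"isin": "US5949181045", "price": 415.10, "sector": "Tech"},
]
slice_ = self_serve(dataset, universe={"US0378331005"}, attributes={"price"})
print(slice_)
```

The point is that the request is data, not code: serving a new consumer or a new attribute set requires no development work on the platform side.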

However, this concept, though revolutionary for most, creates challenges in some instances in data management platforms. In my experience, you need to focus on these important questions:

  • How seamless and timely is it for a data management platform to distribute datasets to new users and feeds?
  • What is the timeliness of adding new attributes to a feed?

The answer to these questions may surprise you. However, it is important to understand that distribution, whether through technologies like Kafka or fixed schema flat files, can be a significant proportion of TCO and more importantly a self-serve platform can really transform your business users’ use of financial data.

When selecting a financial data management vendor, it is important to get answers to these questions. In my view, these are the hidden costs of any TCO and you only find out about them once you are managing the platform. These costs have an impact whether you are hosting or using the vendor’s managed service.

In conclusion, choosing a financial data management platform transcends licence fee assessment. Delving into the TCO requires scrutiny of factors such as the technology stack, disaster recovery, open-source usage and change management processes. It is important to note the impact of the underlying technology on costs, not only in a hosted scenario but also in a managed service, where that cost will simply be passed on to you.

The benefits of a microservices architecture can be overlooked in a decision-making process, but they are significant in the long term in running costs and testing, ensuring better business user satisfaction, and – simply put – enabling you to get the most out of your data.
