
Will SLAs be Re-Evaluated After Tumultuous Times Highlight Response Issues?

Service level agreements were a key topic in this morning’s roundtable discussions at FIMA 2008. One data manager at a Tier 1 financial institution suggested that many SLAs are now likely to be revisited to secure better responses from data suppliers, after current market conditions highlighted the need for faster answers to questions put to vendors.

SLAs between data vendors and their financial institution clients can become elaborate, but the more elaborate they get, the more they cost to support, said a major vendor representative. When agreeing SLAs for offshored services, it is also essential to look at factors such as time zones and turnaround times on queries. What is essential in crafting an SLA is to focus on the key points of service you want to achieve, rather than trying to cover everything.

While vendors will not guarantee the accuracy of the data itself, for a number of reasons, they do guarantee the level of service they provide in areas such as reacting to exceptions. So a certain level of responsiveness is required to satisfy the SLA – for example, a response within an hour for up to 20 requests raised in that hour.
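
As a rough illustration only, the Python sketch below shows how a responsiveness commitment of that shape – answer each query within an hour, for a volume of up to 20 queries per hour – might be checked against a log of query timestamps. The terms and the sla_breaches helper are hypothetical, not any vendor’s actual SLA mechanics.

from datetime import timedelta

# Hypothetical SLA terms of the kind described above: each query is to be
# answered within one hour, for a volume of up to 20 queries per hour.
MAX_RESPONSE = timedelta(hours=1)
MAX_REQUESTS_PER_HOUR = 20

def sla_breaches(queries):
    # queries: list of (raised_at, answered_at) datetime pairs.
    # Returns the queries that breach the response-time commitment;
    # queries beyond the hourly volume cap are skipped, since the
    # commitment only covers the first 20 raised in any given hour.
    breaches = []
    per_hour = {}
    for raised_at, answered_at in sorted(queries):
        bucket = raised_at.replace(minute=0, second=0, microsecond=0)
        per_hour[bucket] = per_hour.get(bucket, 0) + 1
        if per_hour[bucket] > MAX_REQUESTS_PER_HOUR:
            continue  # outside the committed volume, not a breach
        if answered_at - raised_at > MAX_RESPONSE:
            breaches.append((raised_at, answered_at))
    return breaches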

The vendor/client SLA is usually a subset of the SLAs the client has with its own clients, said a buy-side data manager in the discussion. When he evaluates data products, the criteria are cost, coverage and service, with service receiving the largest weighting. His company’s executives, however, push back and put more emphasis on cost and coverage, so a balance between the three criteria has to be struck across suppliers.

Interestingly, the major vendor said that when metrics are analysed over a long period of time, such as 24 months, to see which vendor is right or wrong on a piece of data, the split falls between 48.5% and 51.5%. In other words, all vendors have a similar level of errors when averaged out across market segments, sources or processes.
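
As a hypothetical illustration of how such a long-run split could be tallied (the function and figures below are invented, not the vendor’s methodology), a comparison of two vendors on disputed data points might look like this in Python:

# For each data item where two vendors disagreed, record which one turned
# out to be correct, then compute each vendor's share of the disputes won.
def vendor_accuracy_split(outcomes):
    # outcomes: list of 'A' or 'B' marking which vendor was right.
    total = len(outcomes)
    wins_a = outcomes.count('A')
    return wins_a / total, (total - wins_a) / total

# Over, say, 24 months of exceptions, a split such as (0.49, 0.51) would
# illustrate the point that no vendor is consistently more accurate.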
