
A-Team Insight Blogs

Will SLAs be Re-Evaluated After Tumultuous Times Highlight Response Issues?


Service level agreements were a key topic in this morning’s roundtable discussions at FIMA 2008, with one data manager at a Tier 1 financial institution suggesting that many SLAs are now likely to be revisited in order to secure better responses from data suppliers, after current market conditions highlighted the need for faster answers to queries put to vendors.

SLAs between data vendors and their financial institution clients can become elaborate, but the more elaborate they get, the more they cost to support, said a major vendor representative. When agreeing SLAs for offshored services, it is also essential to consider factors such as time zones and turnaround times on queries. What matters most in crafting an SLA, though, is to focus on the key points of service you want to achieve, rather than trying to cover everything.

While vendors will not guarantee the accuracy of the data itself, for a number of reasons, they do guarantee the level of service they deliver, in areas such as reacting to exceptions. A certain level of responsiveness is therefore required to satisfy the SLA: a response within an hour for up to 20 requests in that hour, for example.
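As a rough illustration of how a responsiveness term like this might be monitored, here is a minimal sketch that checks a log of vendor queries against the figures quoted above. The SLA terms, function name and sample timestamps are hypothetical, not drawn from any actual agreement.

```python
from datetime import datetime, timedelta

# Hypothetical SLA terms of the kind described above: each exception
# query must be answered within an hour, for up to 20 queries raised
# in any given hour.
MAX_RESPONSE = timedelta(hours=1)
MAX_QUERIES_PER_HOUR = 20

def sla_breaches(queries):
    """Return the queries that breach the response-time term.

    `queries` is a list of (raised_at, answered_at) datetime pairs;
    queries beyond the hourly cap fall outside the guarantee.
    """
    breaches = []
    for raised_at, answered_at in sorted(queries):
        # Count queries raised in the hour up to and including this one.
        in_window = sum(
            1 for r, _ in queries
            if raised_at - timedelta(hours=1) < r <= raised_at
        )
        if in_window > MAX_QUERIES_PER_HOUR:
            continue  # over the cap: the response-time guarantee no longer applies
        if answered_at - raised_at > MAX_RESPONSE:
            breaches.append((raised_at, answered_at))
    return breaches

log = [
    (datetime(2008, 11, 12, 9, 0), datetime(2008, 11, 12, 9, 40)),   # within SLA
    (datetime(2008, 11, 12, 9, 5), datetime(2008, 11, 12, 10, 30)),  # breach
]
print(sla_breaches(log))  # -> the 09:05 query, answered after 1h25m
```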

The vendor/client SLA is usually a subset of the SLAs that the client has with its own clients, said a buy side data manager in the discussion. When he evaluates data products, the criteria are cost, coverage and service, with service receiving the largest weighting. His company’s executives push back, however, putting more emphasis on cost and coverage, so a balance between the criteria has to be struck when choosing among suppliers.
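The tension between those weightings can be made concrete with a simple weighted scorecard. The weights and scores below are invented for illustration; the speaker gave no actual figures.

```python
def weighted_score(scores, weights):
    """Combine 0-10 criterion scores into a single figure using the given weights."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(scores[c] * w for c, w in weights.items())

# The data manager's view weights service most heavily; the executives'
# view pushes the weight toward cost and coverage.
data_manager_weights = {"cost": 0.25, "coverage": 0.30, "service": 0.45}
executive_weights = {"cost": 0.40, "coverage": 0.40, "service": 0.20}

vendor = {"cost": 6, "coverage": 8, "service": 9}

print(weighted_score(vendor, data_manager_weights))  # ~7.95, service-heavy view
print(weighted_score(vendor, executive_weights))     # ~7.4, cost/coverage-heavy view
```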

Interestingly, the major vendor said that analysing metrics over a long period, such as 24 months, to see which vendor is right or wrong on a given piece of data produces an average split of between 48.5% and 51.5%. In other words, all vendors have a similar level of errors when averaged across market segments, sources or processes.
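Below is a minimal sketch of how such a right-or-wrong tally might be computed, assuming each vendor’s supplied value can be checked against a figure later confirmed as correct. The function, vendor names and sample records are invented for illustration.

```python
from collections import defaultdict

def hit_rates(records):
    """records: iterable of (vendor, supplied_value, confirmed_value) tuples."""
    hits, totals = defaultdict(int), defaultdict(int)
    for vendor, supplied, confirmed in records:
        totals[vendor] += 1
        hits[vendor] += supplied == confirmed  # True counts as 1
    return {v: hits[v] / totals[v] for v in totals}

records = [
    ("vendor_a", 101.25, 101.25),
    ("vendor_a", 0.042, 0.040),
    ("vendor_b", 101.25, 101.25),
    ("vendor_b", 0.040, 0.040),
]
# The claim above is that over a 24-month window, across segments,
# sources and processes, every vendor's rate lands in a narrow band.
print(hit_rates(records))  # -> {'vendor_a': 0.5, 'vendor_b': 1.0}
```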
