Will SLAs be Re-Evaluated After Tumultuous Times Highlight Response Issues?

Service level agreements were a key topic in this morning’s roundtable discussions at FIMA 2008, with one data manager at a Tier 1 financial institution suggesting that many SLAs are now likely to be revisited to secure better responses from data suppliers, after current market conditions highlighted the need for faster answers from vendors.

SLAs between data vendors and their financial institution clients can become elaborate, but the more elaborate they get, the more they cost to support, said a major vendor representative. When agreeing SLAs for offshored services, it is also essential to look at other factors such as time zones and turnaround times on queries. What is essential in crafting an SLA is to focus on the key points of service you want to achieve, rather than trying to cover everything.

While vendors will not guarantee the accuracy of the data itself, for a number of reasons, they do guarantee the level of service they provide, in areas such as reacting to exceptions. So a certain level of responsiveness is required – such as a response within an hour for up to 20 requests in that hour – to satisfy the SLA.
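
For illustration only, the minimal Python sketch below shows one way such a responsiveness clause could be monitored. The log format is hypothetical, and the one-hour limit and 20-requests-per-hour cap simply echo the figures in the example above rather than any real vendor’s SLA.

```python
from datetime import datetime, timedelta

# Hypothetical log of exception queries sent to a vendor: (requested, responded) pairs.
# The one-hour response limit and the 20-requests-per-hour cap mirror the example
# above; they are not taken from any actual vendor SLA.
MAX_RESPONSE = timedelta(hours=1)
MAX_REQUESTS_PER_HOUR = 20

def sla_breaches(queries):
    """Return queries answered late, where the agreed hourly volume was not exceeded."""
    breaches = []
    for requested, responded in queries:
        # Count requests falling in the hour starting at this request.
        in_hour = sum(1 for r, _ in queries
                      if requested <= r < requested + timedelta(hours=1))
        if in_hour > MAX_REQUESTS_PER_HOUR:
            continue  # above the agreed volume, so the response-time clause does not apply
        if responded - requested > MAX_RESPONSE:
            breaches.append((requested, responded))
    return breaches

if __name__ == "__main__":
    log = [
        (datetime(2008, 11, 3, 9, 0), datetime(2008, 11, 3, 9, 40)),   # answered within the hour
        (datetime(2008, 11, 3, 9, 5), datetime(2008, 11, 3, 11, 10)),  # answered late
    ]
    for requested, responded in sla_breaches(log):
        print(f"Breach: query raised {requested} was only answered at {responded}")
```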

The vendor/client SLA is usually a subset of the SLAs that the client has with its own clients, said a buy-side data manager in the discussion. When he evaluates data products, the criteria are cost, coverage and service, with service receiving the largest weighting. This is then pushed back on by his company’s executives, who put more emphasis on cost and coverage, so it is necessary to strike a balance between the criteria when comparing suppliers.
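
As a rough illustration of that balancing act, the short sketch below scores two hypothetical vendors on the three criteria mentioned, once with the data manager’s emphasis on service and once with the executives’ emphasis on cost and coverage. All of the names, scores and weights are invented for the example.

```python
# Illustrative weighted scoring of data products on the three criteria mentioned above.
# The vendors, scores and weights are all invented; the discussion only indicates that
# the data manager weights service most heavily, while executives favour cost and coverage.
def weighted_score(scores, weights):
    """Combine 0-10 criterion scores into a single figure using the given weights."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights should sum to 1"
    return sum(scores[criterion] * weight for criterion, weight in weights.items())

vendors = {
    "Vendor A": {"cost": 6, "coverage": 8, "service": 9},
    "Vendor B": {"cost": 9, "coverage": 7, "service": 5},
}

data_manager_view = {"cost": 0.2, "coverage": 0.3, "service": 0.5}
executive_view = {"cost": 0.4, "coverage": 0.4, "service": 0.2}

for name, scores in vendors.items():
    print(name,
          "data manager:", round(weighted_score(scores, data_manager_view), 2),
          "executives:", round(weighted_score(scores, executive_view), 2))
```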

Interestingly, the major vendor said that when metrics are analysed over a long period, such as 24 months, to see which vendor is right or wrong on a piece of data, the average falls between 48.5% and 51.5%. In other words, all vendors have a similar level of errors when averaged across market segments, sources and processes.
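
One way to picture that kind of long-run comparison is to tally, across data points where vendors disagree, how often each turned out to be correct. The sketch below uses invented records; the point is simply that, measured this way over a long window, the vendors’ rates end up close to 50%.

```python
from collections import defaultdict

# Hypothetical discrepancy records: for each data point where two vendors disagreed,
# 'winner' notes which vendor turned out to be correct. The article's claim is that,
# over a long window such as 24 months, each vendor's "right" rate clusters
# between 48.5% and 51.5%.
discrepancies = [
    {"field": "coupon", "winner": "Vendor A"},
    {"field": "maturity date", "winner": "Vendor B"},
    {"field": "price", "winner": "Vendor A"},
    {"field": "rating", "winner": "Vendor B"},
]

wins = defaultdict(int)
for record in discrepancies:
    wins[record["winner"]] += 1

total = len(discrepancies)
for vendor, count in sorted(wins.items()):
    print(f"{vendor}: correct in {count / total:.1%} of disputed cases")
```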
