

Will SLAs be Re-Evaluated After Tumultuous Times Highlight Response Issues?

Service level agreements were a key topic in this morning’s roundtable discussions at FIMA 2008, with one data manager at a Tier 1 financial institution suggesting that many SLAs are now likely to be revisited in order to secure better responses from data suppliers, after current market conditions highlighted the need for faster answers from vendors.

SLAs between data vendors and their financial institution clients can become elaborate, but the more elaborate they get, the more they cost to support, said a major vendor representative. When agreeing SLAs for offshored services, it is also essential to look at factors such as time zones and turnaround times on queries. What is essential in crafting an SLA is to focus on the key points of service you want to achieve, rather than trying to cover everything.

While vendors will not guarantee the accuracy of the data itself, for a number of reasons, they do guarantee the level of service they provide in areas such as reacting to exceptions. So a certain level of responsiveness is required to satisfy the SLA – for example, a response within an hour for up to 20 requests in that hour.
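As an illustration only, here is a minimal Python sketch of how such a responsiveness target might be monitored, assuming hypothetical request and response timestamps; the one-hour and 20-requests-per-hour figures are the only details taken from the discussion above, everything else is invented for the example.

from datetime import datetime, timedelta

# Hypothetical SLA terms taken from the example above:
# respond within one hour, for up to 20 requests per rolling hour.
MAX_RESPONSE_TIME = timedelta(hours=1)
MAX_REQUESTS_PER_HOUR = 20

def breaches(requests):
    """Return the exception queries that missed the one-hour response target.

    `requests` is a list of (submitted_at, answered_at) datetime pairs.
    Queries beyond the 20-per-hour volume cap fall outside the SLA's scope
    and are not counted as breaches.
    """
    missed = []
    for submitted_at, answered_at in requests:
        window_start = submitted_at - timedelta(hours=1)
        # How many queries were raised in the hour up to and including this one?
        in_window = sum(1 for s, _ in requests if window_start <= s <= submitted_at)
        if in_window > MAX_REQUESTS_PER_HOUR:
            continue  # over the agreed volume, so outside the SLA's scope
        if answered_at - submitted_at > MAX_RESPONSE_TIME:
            missed.append((submitted_at, answered_at))
    return missed

reqs = [
    (datetime(2008, 11, 12, 9, 0), datetime(2008, 11, 12, 9, 40)),   # answered in 40 minutes
    (datetime(2008, 11, 12, 9, 5), datetime(2008, 11, 12, 10, 30)),  # answered in 85 minutes
]
print(breaches(reqs))  # only the second query is reported as a breach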

The vendor/client SLA is usually a subset of the SLAs that the client has with its own clients, said a buy side data manager in the discussion. When he evaluates data products, the criteria are cost, coverage and service, with service receiving the largest weighting. His company’s executives then push back, putting more emphasis on cost and coverage, so a balance between the criteria has to be struck when choosing among suppliers.
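A hedged sketch of the kind of weighted scorecard this implies follows; the specific weights and supplier scores are invented for illustration, since the speaker only said that service carries the largest weighting while executives favour cost and coverage.

# Hypothetical weights: the data manager weights service most heavily,
# while executives would push more weight onto cost and coverage.
DATA_MANAGER_WEIGHTS = {"cost": 0.25, "coverage": 0.30, "service": 0.45}
EXECUTIVE_WEIGHTS = {"cost": 0.40, "coverage": 0.40, "service": 0.20}

def weighted_score(scores, weights):
    """Combine per-criterion scores (0-10) into a single supplier score."""
    return sum(scores[criterion] * weight for criterion, weight in weights.items())

# Illustrative supplier scores, not real vendor assessments.
supplier = {"cost": 6, "coverage": 8, "service": 9}
print(weighted_score(supplier, DATA_MANAGER_WEIGHTS))  # 7.95
print(weighted_score(supplier, EXECUTIVE_WEIGHTS))     # 7.4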

Interestingly, the major vendor said that when metrics are analysed over a long period, such as 24 months, to see which vendor is right or wrong on a piece of data, each vendor’s share of being correct averages between 48.5% and 51.5%. In other words, all vendors have a similar level of errors when averaged out across market segments, sources and processes.
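A minimal sketch of the kind of long-run comparison being described: for each data point where two vendors disagree, record which one matched the eventually confirmed value, then look at each vendor’s share of those disputes. The field names and the confirmed-value check are assumptions for illustration, not the vendor’s actual methodology.

from collections import Counter

def win_rates(disputes):
    """For records where two vendors disagree, compute each vendor's share of
    being the one that matched the confirmed value.

    `disputes` is an iterable of (vendor_a_value, vendor_b_value, confirmed_value).
    """
    wins = Counter()
    for a_value, b_value, confirmed in disputes:
        if a_value == confirmed and b_value != confirmed:
            wins["vendor_a"] += 1
        elif b_value == confirmed and a_value != confirmed:
            wins["vendor_b"] += 1
    total = sum(wins.values()) or 1
    return {vendor: count / total for vendor, count in wins.items()}

# Over a long enough window (e.g. 24 months), the claim above is that each
# vendor's share ends up close to 50% - roughly 48.5% to 51.5%.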
