Will SLAs be Re-Evaluated After Tumultuous Times Highlight Response Issues?

Service level agreements were a key topic in this morning’s roundtable discussions at FIMA 2008, with one data manager at a Tier 1 financial institution suggesting that many SLAs are now likely to be revisited to secure better responses from data suppliers, after current market conditions highlighted the need for faster answers from vendors to clients’ questions.

SLAs between data vendors and their financial institution clients can become elaborate, but the more elaborate they get, the more they cost to support, said a major vendor representative. When agreeing SLAs for offshored services, it is also essential to consider factors such as time zones and turnaround times on queries. What matters most in crafting an SLA is to focus on the key points of service you want to achieve, rather than trying to cover everything.

While vendors will not guarantee the accuracy of the data itself, for a number of reasons, they do guarantee the level of service they deliver in areas such as reacting to exceptions. So a certain level of responsiveness is required to satisfy the SLA – for example, a response within an hour for up to 20 requests in that hour.
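As a purely illustrative sketch – none of this code comes from the discussion, and the thresholds are assumptions modelled on the example quoted above – a responsiveness clause of that kind could be monitored with a simple check over logged exception queries and vendor responses:

```python
from datetime import datetime, timedelta

# Hypothetical SLA terms: up to 20 requests per hour must each
# receive a response within one hour of being raised.
MAX_REQUESTS_PER_HOUR = 20
MAX_RESPONSE_TIME = timedelta(hours=1)

def sla_breaches(requests):
    """requests: list of (raised_at, responded_at) datetime pairs.
    Returns the pairs whose response time exceeded the SLA limit,
    counting only the first 20 requests raised in any given hour."""
    breaches = []
    per_hour = {}
    for raised_at, responded_at in sorted(requests):
        bucket = raised_at.replace(minute=0, second=0, microsecond=0)
        per_hour[bucket] = per_hour.get(bucket, 0) + 1
        if per_hour[bucket] > MAX_REQUESTS_PER_HOUR:
            continue  # beyond the agreed volume, the clause does not apply
        if responded_at - raised_at > MAX_RESPONSE_TIME:
            breaches.append((raised_at, responded_at))
    return breaches

# Example: one query answered in 40 minutes, one in 90 minutes.
queries = [
    (datetime(2008, 11, 3, 9, 5), datetime(2008, 11, 3, 9, 45)),
    (datetime(2008, 11, 3, 9, 10), datetime(2008, 11, 3, 10, 40)),
]
print(sla_breaches(queries))  # only the second pair is reported as a breach
```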

The vendor/client SLA is usually a subset of the SLAs that the client has with its own clients, said a buy side data manager in the discussion. When he evaluates data products, the criteria are cost, coverage and service, with service receiving the largest weighting. His company’s executives push back on this, placing more emphasis on cost and coverage, so a balance between the criteria has to be struck across suppliers.
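To illustrate that balancing act – a hypothetical sketch, with weights and scores invented rather than taken from the discussion – vendor evaluation along cost, coverage and service can be reduced to a simple weighted score, with the weighting shifting depending on who is doing the evaluating:

```python
# Hypothetical evaluation: each vendor is scored 0-10 on the three
# criteria mentioned in the discussion. The weights are assumptions:
# the data manager's view gives service the largest weighting, while
# an executive view tilts towards cost and coverage.
vendors = {
    "Vendor A": {"cost": 6, "coverage": 8, "service": 9},
    "Vendor B": {"cost": 8, "coverage": 7, "service": 6},
}

manager_weights = {"cost": 0.2, "coverage": 0.3, "service": 0.5}
executive_weights = {"cost": 0.4, "coverage": 0.4, "service": 0.2}

def weighted_score(scores, weights):
    """Weighted sum of criterion scores; weights sum to 1."""
    return sum(scores[c] * weights[c] for c in weights)

for name, scores in vendors.items():
    print(name,
          round(weighted_score(scores, manager_weights), 2),
          round(weighted_score(scores, executive_weights), 2))
```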

Interestingly, the major vendor said that when metrics are analysed over a long period, such as 24 months, to determine which vendor was right or wrong on a given piece of data, the split averages out at between 48.5% and 51.5%. In other words, all vendors have a similar level of errors when averaged across market segments, sources and processes.
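As a rough sketch of how such a long-run comparison might be computed – the records and field names below are entirely hypothetical, and the toy sample is far too small to reproduce the 48.5%–51.5% split the speaker described – the “who was right” tally can be derived by comparing each vendor’s value against the value eventually confirmed as correct:

```python
# Hypothetical two-vendor comparison: for each disputed data point we
# record which vendor matched the value that was eventually confirmed.
# Over a long window (e.g. 24 months) the split, per the speaker,
# tends to land between roughly 48.5% and 51.5% for either side.
disputes = [
    {"field": "coupon",   "vendor_a": 4.25, "vendor_b": 4.50, "confirmed": 4.25},
    {"field": "maturity", "vendor_a": "2012-06-01", "vendor_b": "2012-06-15", "confirmed": "2012-06-15"},
    {"field": "rating",   "vendor_a": "A+", "vendor_b": "A", "confirmed": "A+"},
]

a_right = sum(d["vendor_a"] == d["confirmed"] for d in disputes)
b_right = sum(d["vendor_b"] == d["confirmed"] for d in disputes)
total = len(disputes)

print(f"Vendor A right: {a_right / total:.1%}")  # 66.7% on this toy sample
print(f"Vendor B right: {b_right / total:.1%}")  # 33.3% on this toy sample
```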
