The knowledge platform for the financial technology industry

A-Team Insight Blogs

Keeping a Team Onshore to Monitor Data Quality is Key, Agrees Offshoring and Outsourcing Panel


Institutions should keep a small team of data experts on board to adequately monitor that service level agreements are being met by outsourcing providers and that data requirements are being catered to by offshore teams, agreed FIMA 2008’s panel on offshoring and outsourcing.

Susan Outzen, business relationship manager in the enterprise data management (EDM) division at HSBC, talked about her previous experience at UBS, where the bank kept a small team onshore to deal with high-risk data while outsourcing the low-risk data. “It was good to have that expertise remaining in-house to monitor what was going on and react to any problems that the outsourced unit experienced,” she explained. HSBC has decided to take the offshoring approach in order to deal with capacity issues and increase data capacity across the organisation, Outzen continued. “It was not primarily driven by cost savings, because these were only around 25-30%,” she said.

Predrag Dizdarevic, president of KonsultLab and chair of the panel, recommended finding a good onshore team to control the outsourcing or offshoring relationship. Jean Pierre Gottdiener, independent consultant, seconded this notion and urged firms to hire an “available and aware” team. “The staff needs to be aware of the issues that may crop up and the service level agreement must be very clear,” he added.

Dizdarevic listed for delegates some of the potential pitfalls within outsourcing agreements, including liability when a vendor makes a mistake, complications in the internal delivery of data and meeting certain legal requirements. Pricing could also be an issue, he said: “Don’t expect these vendors to offer a commoditised price because this business is in the early stages of development and they are likely to charge higher one-off project rates.”
