The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Management Summit – Solving the Onshore or Offshore Conundrum

Christopher Bannocks, global head of reference data at Barclays Bank, detailed the pros and cons of offshoring and onshoring data management at last week’s A-Team Group Data Management Summit, concluding that value creation rather than cost arbitrage has become the decider on where work should be placed. He also fired a shot across the bows of so-called domain experts, whose expertise, he suggested, could be relocated to offshoring sites such as India in the medium term if not sooner.

With some 45% of his team offshore, Bannocks has heeded the advice of business management gurus Tom Peters and Peter Drucker to ‘do what you do best and outsource the rest’, although he is conscious that decisions are no longer that straightforward. He explained: “Business process outsourcing (BPO) is becoming a search for excellence. This is more important than the cost discussion of the past decade. Many firms have started outsourcing, but there is still a long way to go and nine out of ten feel they have not yet done enough.”

That said, Bannocks noted that whatever the extent of outsourcing, it is vital to pay constant attention to an organisation’s shape and size, considering not only the offshore element, but also the retained onshore component and the value it adds above the offshore element.

He went on to describe the exceptional team building and bonding he experienced at outsourcing suppliers on his first visit to Mumbai, and covered the benefits of outsourcing, including a co-located team prepared to work 24/7 to cover global time zones and skilled to work on a variety of platforms. He suggested such teams can sometimes see opportunities for rationalising data management that are difficult to identify in organisations with many data silos, and can deliver a lower total cost of ownership.

Bannocks said the selection of offshore BPO depends on the size of an organisation and the tasks that must be achieved, but noted the benefits of a BPO provider’s capacity to take immediate action and its flexible supply model. He contrasted BPO with a captive offshore operation, which takes longer to set up as premises must be found and teams hired and trained, but pointed here to a better connection with the bank, its aims, objectives and career paths. “A mix could be good, but the trade-off is between speed and inclusion,” he said.

Turning to the retained onshore organisation, Bannocks said this remains crucial as a central location for oversight and management, and as a business base that is close to customers and revenue.

He explained: “Retained organisation functions include customer service, analytics and quality, and oversight and regulation. It may be necessary to develop skills in these areas and it may be necessary to develop data management skills in cases where customer data must be kept onshore, but data analytics are often better offshore, so a mix of data management may be required. Maintaining proximity to market means better intelligence gathering and better relationships with vendor partners.”

Addressing subject matter expertise, Bannocks said onshore expertise could diminish as offshore expertise grows, and set out the best positioning for a bank’s expertise, saying: “Generic data expertise needs to be offshore; expertise in business processes and internal systems needs to move offshore within 18 months; and market expertise needs to stay onshore.”

Looking forward, Bannocks concluded: “We are in the financial services and information management industries. Over time, as offshoring increases, cost arbitrage will decrease, so the need is to keep training and educating to sustain value offshore. We need to think a decade into the future about how the choices we make today will affect our teams in ten years’ time. We want them to be proud data professionals.”

