The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

The Interview: Full Circle for EDI


Nearly a decade ago, a licence to distribute London Stock Exchange data started Exchange Data International down the data aggregation route. RDR reports on how far the company has come.

A little over two years ago, the phrase reference data was pretty much unknown. Today it is in danger of becoming one of those tick-box capabilities that suppliers claim to have because they’ve heard that customers are buying it.

That might be putting it a bit strongly, but it is one of the factors that both irritates and amuses managing director Jonathan Bloch and his colleagues at Exchange Data International. “There is something of a vogue for it at the moment,” says Bloch. “It seems to cover a lot of ground, but I’m not convinced that there’s a lot of reality behind some of the things being claimed.”

Bloch’s essential point is that much of what is being grouped under the reference data umbrella is little more than good honest implementation and deployment of technology upgrades in areas that have been out of the spotlight for a long time.

“What people are not prepared to do at the moment is spend vast amounts of money doing a wholesale cleaning of their databases,” he says. “What they are willing to do is take feeds, such as to update name changes.… That’s where we are seeing lots of demand in corporate actions and dividend information.”

Kevin Brady, EDI’s sales director, expands on this: “People are trying to clean the back-end of their databases, which is typically corporate action and security master file data. What we haven’t really seen is the integration of the front, middle and back that everyone is talking about.”

Brady questions the viability of taking the brute force approach to rewriting back-end databases. “That golden copy approach is still questionable: what is a golden copy? If two sources say that it’s right and they’re both wrong? That gets interesting,” he says. “To define a securities master file you need to go back to the parent entity. People come to us and say that they want to clean up their data – they want to know the country of incorporation, the legal entity that they are dealing with and the rest … there are no defined standards for that, so you have to go and research it. We’ve done that for a number of clients.”

The gathering and aggregation of what was once called static data has always been EDI’s role: a basic product set at the outset has been expanded, as technology developed, into what Bloch hopes will make the company an international contender within a year.

“Basically, we started in June 1994 and took a licence from the London Stock Exchange to take their print products – which was essentially Stock Situation Notices – and made them electronic, and also take their code book and build an electronic product, which was in those pre-Internet days a bulletin board,” says Bloch.

“Very soon thereafter people came and said that they liked what we were doing, but that they wanted us to cover the world, not just the U.K. We wondered how we were going to pick up all those data sets. What we decided on was a sort of brick-by-brick approach: one of the first products was a depositary receipts database, then we expanded to shares outstanding, worldwide dividends and then worldwide corporate actions.”

The idea was to cover every single equity, but there were various abortive starts with the worldwide corporate actions data. “The simple reason was that the technology wasn’t there to do that efficient cheap data gathering,” says Bloch. What changed that was the introduction of the File Transfer Protocol.

“You didn’t have to rely on papers being delivered, tapes arriving and so on, you just FTP’d it across, and that really streamlined the business. Also, at the same time, the Internet meant that the technology for the client became a lot simpler. If you were setting up a bulletin board, you almost invariably had to go to the client site to install it, or alternatively send them a CD.”

There were other issues with the bulletin board approach in those days that anyone younger than their early 30s will find hard to credit, such as the fact that they didn’t work very well through switchboards, so there had to be a dedicated phone line.

“All of which meant that there were huge barriers, technologically, to easily implementing a system,” says Bloch. “With the Internet, it was simple – you can be anywhere in the world and get on. That combination of FTP and what’s now called ASP has revolutionised the business. It also meant that people could take feeds, because there were no bandwidth issues; there was lots of data.”

For vendors, the result was that it became simpler to both gather and disseminate data in disparate geographical locations.

This had a direct effect on EDI, which now does 50% of its business outside the U.K., predominantly in the U.S.

With the technology in place, the current main priority of the company is to continue to expand the data set. “We have two projects on the go at the moment,” says Bloch. “One is for closing prices, where we again hope to cover all equity markets. The second is an expansion of our corporate actions service to cover fixed income. This year we have already added corporate actions for warrants, which was a departure for us because we moved beyond the equities area. We’ve had a team of consultants working for several months now on the specification and obtaining the data sources.”

The data gathering side is to some extent an extended series of partnerships and relationships. “We have over 300 sources of data, and when we have finished with the fixed-income side, I reckon we’ll have over 1,000,” says Bloch.

“One of the things that we have done is to hone down our skills in the aggregation of data and in data management, which allows us to operate with considerably fewer staff than our competitors, who have issues with legacy products and hardware. People are surprised when they see our operations, but we had automated as much as possible right from the beginning. For us, the technology was the great enabler.”

The data operations are based around researchers in the U.K., with sites in London and Horsham, and in Mumbai, India, where EDI has had some 25 analysts for the past three years who field the data, check the security coding and FTP it back to London for dissemination to clients. The Indian staff will be expanded to handle the growth in fixed-income data.

“To a great extent the market is driving this,” says Brady. “We engage in a consultative discussion with our clients to work with them instead of just handing over a file of data. That’s very important to us – we have to face the reality that we’re not exactly the largest information vendor in the world, so we have to find unique ways of forming relationships with major institutions. Within our 220 international clients, we have those relationships, in one way or another, with most of the Blue Chips and many of the other institutions, and we’re looking to expand that.”

He admits that there are some obstacles to be overcome in doing that. “In some ways, we are the new player on the block: traditionally there are FT Interactive Data and Telekurs, and increasingly there are Reuters and, to some extent, Bloomberg,” says Brady.

“People either have one or two of those, and we are an alternative – both in terms of quality data and, the bottom line, price. People have been spending a lot of money on market data that they are actually integrating into legacy systems. In that scenario, we can provide a quality alternative at a lower price, and we are going on the marketing offensive with that message.”

With the end of the company’s first decade coming up, Bloch is very clear that it will be in a position to celebrate: “By the end of the first or second quarter next year, we will be a serious player in the back-office/middle-office worldwide corporate actions area for fixed income, warrants and equities, closing prices for equities, and the securities master file based on Sedol.”
