Calling All Data Feeds…

Dutch data management specialist Asset Control has ambitious plans to integrate the complete set of reference datafeeds into a single industry utility, chief executive Ger Rosenkamp explains.

When Asset Control announced at the end of last month that it had mapped and normalised Telekurs Financial’s Valordata Feed (VDF) to its AC Plus system and ACDEX service (see RDR, Issue 03), there were a few raised eyebrows among both data vendors and software developers.

The fact that it also announced that it plans to do the same for the rest of the datafeeds out there raised even more. “Two years from now we will have maybe twenty feeds normalised,” says Ger Rosenkamp, chief executive of the Netherlands-based company. “One customer can say they want to take data from say, feeds three, four and five, and another customer might want feeds 10, 16 and 20. They can make their own combination, reflecting the geographical and market coverage of the vendor feeds.”

Among the feeds that Rosenkamp says the company will be working to integrate are Bloomberg, Reuters, FT Interactive Data, Telekurs Financial, Thomson Financial, Standard & Poor’s J.J. Kenny, and various other S&P services.

But Rosenkamp says that the scale of this work is not as important as how and why they’re doing it. “The announcement about what we’ve done with Telekurs’ VDF is not just about VDF, it’s about the methodology and the emerging data standards,” he says. “What we’ve been doing with VDF is integrating it to the emerging data standards, like MDDL and ISO15022. We work with all the standards bodies and participate in their committees. Once standards are accepted both by the vendors and the downstream applications, the interfacing will be a piece of cake. They will talk seamlessly to each other.”

Rosenkamp might be thought to have something against the information vendors: in one way or another, he has been working to erode their grip on customers’ wallets for two decades now. Asset Control can, with some justification, claim to have been one of the first companies to address the issues of what we now call reference data, and Rosenkamp sees parallels with the situation in the real-time market data industry in the 1980s and 1990s. “Twenty years ago the data vendors and application vendors all had proprietary data standards: it was not an open market at all,” he says. “When I started Stock Data one of the key success factors was to create conversion tools that allowed users to retrieve the data only once but to use it multiple times. We recognised that we had customers that were buying the same data 50 times.”

This was carried into Asset Control when it started in 1991. “I thought that there was an enormous opportunity to create data transparency but there was a lot of resistance. The market is changing, and some data vendors have changed their attitude, but I can’t say that is the case with all the vendors.”

The current push towards standards through groups like the Reference Data User Group and Redac, along with the Market Data Definition Language (MDDL) and ISO15022, gives him hope. “The absence of data standards is still an enormous problem, but ultimately every vendor will be forced to accept standards, whether they like it or not. They will have no choice.” He likens the situation to that which led to the development of user groups such as the Information Providers Users Group (IPUG) as an effective voice against companies like Reuters and Bloomberg.

But while there are similarities in the structure and politics – and, indeed, many of the same players – between the real-time and reference data industries, there are important differences. For one thing, the reference data market is more fragmented, lacking the large dominant players across markets and territories – but this is precisely why data management is such an issue. “There are not many, if any, feeds that have all the data customers want,” says Rosenkamp.

“There are two reasons to look at a centralised data management system like ours: one is to centralise everything to compare data sources for data quality, and get rid of redundancy. The other reason is to integrate the data from various vendors to have a complete data horizon.”

The comparison of various data sources for quality reasons is central. “A few years ago it was risk management that created a market for quality data, then it was STP and T+1 that had people screaming for quality data,” he says. “With risk you are talking about market data, which is not that difficult to cleanse because you can compare neighbouring data – if you have a row of tens and then suddenly a hundred, it’s clear that the decimal point is in the wrong place – but there are more complex issues with other data. Dividends, for example: if you have a dividend of £1, how do you know it’s not supposed to be £10? You can compare it with historical values, but the best way is to compare the values from several vendors. You can create all kinds of business rules to compare the data, and that increases the reliability enormously.”
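
To make the kind of business rule he describes concrete, here is a minimal sketch in Python (not Asset Control code; the feed names, the 10% tolerance and the use of a median consensus are illustrative assumptions) that flags a vendor whose dividend figure disagrees with its peers, which is how a slipped decimal point would surface:

```python
from statistics import median

def check_dividend(values_by_vendor, tolerance=0.10):
    """Flag vendors whose reported dividend deviates from the consensus
    of the other feeds, e.g. 1.00 reported where peers say 10.00."""
    consensus = median(values_by_vendor.values())
    return [vendor for vendor, value in values_by_vendor.items()
            if abs(value - consensus) > tolerance * consensus]

# Three feeds agree on roughly 10.00, one reports 1.00: the outlier is flagged.
print(check_dividend({"feed_a": 10.00, "feed_b": 10.00,
                      "feed_c": 10.05, "feed_d": 1.00}))
# ['feed_d']
```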

Cross-Vendor Data Model

In mapping the rest of the datafeeds on the list, Asset Control will take the same approach as it has with Telekurs. “We can hold several data models in parallel to each other,” says Rosenkamp. “We retrieve the data from the vendor, store it according to their data model, then map it to the industry standards and transfer it to a normalised data model, which is a cross-vendor data model. The advantage of this is that you avoid mapping one vendor to another, which creates spaghetti and unsolvable problems.”
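
The pay-off of mapping each feed once onto a shared model, rather than onto every other feed, can be sketched in a few lines. The Python below is purely illustrative (the vendor names, field names and normalised attributes are invented for the example and not drawn from AC Plus or ACDEX):

```python
# Each feed gets one mapping table onto the shared, cross-vendor model.
# With twenty feeds that is twenty mappings; mapping every vendor directly
# to every other vendor would need 20 x 19 of them - the "spaghetti"
# Rosenkamp describes. All names below are invented for illustration.

NORMALISED_FIELDS = ("isin", "currency", "instrument_type")

FEED_MAPPINGS = {
    "vendor_a": {"ISIN_CD": "isin", "CRNCY": "currency", "INSTR": "instrument_type"},
    "vendor_b": {"IsinCode": "isin", "Ccy": "currency", "SecType": "instrument_type"},
}

def to_cross_vendor_model(vendor, raw_record):
    """Project a record, stored in the vendor's own model, onto the
    normalised model used by downstream applications."""
    mapping = FEED_MAPPINGS[vendor]
    normalised = {field: None for field in NORMALISED_FIELDS}
    for vendor_field, value in raw_record.items():
        if vendor_field in mapping:
            normalised[mapping[vendor_field]] = value
    return normalised

print(to_cross_vendor_model("vendor_a",
                            {"ISIN_CD": "CH0012221716", "CRNCY": "CHF", "INSTR": "EQY"}))
print(to_cross_vendor_model("vendor_b",
                            {"IsinCode": "GB0002634946", "Ccy": "GBP", "SecType": "Equity"}))
```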

This applies to both AC Plus, a stand-alone system for deployment at customer sites, and ACDEX, a centralised service that Asset Control has grand ambitions for. “With ACDEX, the purpose was to create an industry utility establishing data transparency,” says Rosenkamp. AC Plus is technically identical to ACDEX. “We have become a customer of our own technology. Both systems use the same data models,” he says.

Originally, users were not keen on the idea of the service. “When we first talked to our customers about ACDEX at the beginning of last year they said ‘nice idea, but it won’t work’,” he says. “They didn’t want to outsource their data management, because they make money from certain aspects of their data management.”

The solution is something of a compromise, but one that he feels addresses the needs of customers. “What you have is an outsourced utility that consolidates all the commoditised data – that is all the data that they buy from external sources,” he says. “If they can take the Golden Copy that we make in ACDEX and consolidate it with their internal systems, then you have a combination that is really valid.”
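
In practice, that consolidation amounts to overlaying the attributes a firm maintains internally on top of the externally sourced golden copy. A minimal sketch, assuming hypothetical field names and a simple "internal data wins" rule (not a description of how ACDEX itself behaves):

```python
def consolidate(golden_copy, internal_record):
    """Start from the utility's golden copy of the commoditised data and
    overlay the attributes the firm maintains itself; internal values win
    wherever both sides carry the same field."""
    merged = dict(golden_copy)
    merged.update(internal_record)
    return merged

golden_copy = {"isin": "GB0002634946", "currency": "GBP",
               "issuer": "BAE Systems", "sector": "Aerospace"}
internal = {"isin": "GB0002634946", "internal_id": "EQ-000172",
            "sector": "Defence"}   # the firm's own classification overrides

print(consolidate(golden_copy, internal))
# The commoditised fields come through unchanged; 'sector' reflects the
# internal view and 'internal_id' is added from the firm's own systems.
```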

The creation of a central utility model in ACDEX along with the mapping of all of the relevant data is clearly a major undertaking, and it is one that not everyone wants to see the company succeed in. “Obviously, not all the data vendors are in favour of this, because they still have proprietary formats – for certain vendors it takes over a year to interface their feeds for customers,” he says. “For us, the customer has always been king, and we don’t want to be ruled by the data vendors.”
