As first revealed by Reference Data Review in May last year, the Dubai International Financial Centre’s (DIFC) DClear subsidiary has been working on a reference data utility for some time. DClear Utilities, now part of DIFC-owned SmartStream Technologies, has officially launched on the market and has just bagged its first client, explains John Mason, newly appointed CEO of the vendor’s utility division.
DIFC, if you remember, bought back office solutions vendor SmartStream in November 2007, and as a result has been using the vendor’s expertise to speed its progress in the utility-building endeavour. To this end, it enlisted Mason, who was previously SmartStream’s UK regional director, to head DClear Utilities.
Former DClear CEO Philippe Chambadal was appointed head of SmartStream earlier this year, but the utilities-focused operation has retained David Penney as new products director and chief technology officer. The DClear Utilities division is aiming to provide shared data and trade processing utilities for the financial services community, according to Mason.
This will be achieved via the centralised processing of financial instruments across their entire lifecycle through an on demand style platform, which will act as a shared service centre for common processing services. The vendor claims that by consolidating functions such as confirmations matching, order/trade matching and position management into a single entity, it will deliver greater transaction visibility, lower operational risk and reduce a firm’s cost per trade.
The vendor already has its first client and is working on building the utility to meet its needs, says Mason. It is due to go live with this client in September this year and Mason is hopeful that more clients will come on board before the end of the fourth quarter.
“The traditional approach in the reference data space is to have a single vendor-client relationship but we are looking at a network-style model instead. Dealing with a single client’s reference data issues in isolation doesn’t help them eliminate trade breaks caused by other counterparties’ faulty reference data,” explains Mason.
The DClear model is therefore based around the shared, standardised delivery of commoditised functions for back office processing and a centralised repository of standardised data, which the vendor reckons will deliver significant risk mitigation and reduce operational cost.
“In the current financial market of decreasing margins and volume volatility, the need to reduce transaction costs through more efficient back office processes that reduce trade breaks and increase risk controls has become even more urgent. We see that pressure is driving the trend towards the standardisation of systems and processes that offer no strategic differentiation yet consume vast amounts of time, effort and money,” continues Mason.
Chambadal, who has long been a proponent of this project, adds: “DClear’s industry Utilities can create a network effect by demonstrating to all market participants that using the same reference data and reconciliation and exception management processes will significantly reduce risk and cost. By taking a co-operative approach among market participants to solve a common problem these utilities deliver the economies of scale that benefit all participants, initially for reference data and then for more generic back office processing.”
The utility will deal with reference data beyond just issuer information, he says, including securities identifiers, legal entity, client, counterparty and agent-broker data. The vendor’s first client, which has not been named but is based in the US, has asked the vendor to deal with the aggregation, reconciliation and monitoring of its full securities master data, explains Mason.
“We are working on cross referencing all this securities master data and providing a full data dictionary of information such as exchange feeds data.” The vendor is dealing with this data at the individual data field level and providing integration and cleansing of data sources, with rules-based processing providing integrity checks on all incoming data. As a result the customer is hopeful that the utility will ensure fewer trade breaks by providing clean and consistent data, integrated across all assets, entities and instructions.
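The article does not detail how DClear’s rules-based processing actually works; purely as a hedged illustration of what field-level integrity checks on incoming securities master data could look like, a minimal sketch (all field names, rules and values here are invented, not SmartStream’s) might be:

```python
import re

# Hypothetical field-level validation rules for an incoming securities
# master record; each rule returns True when the field passes its check.
RULES = {
    # ISIN: two letters, nine alphanumerics, one check digit (ISO 6166 shape)
    "isin": lambda v: bool(re.fullmatch(r"[A-Z]{2}[A-Z0-9]{9}[0-9]", v or "")),
    # Currency must come from a small whitelist of known codes
    "currency": lambda v: v in {"USD", "EUR", "GBP", "CHF", "JPY"},
    # Maturity date, when present, must be an ISO-8601 calendar date
    "maturity": lambda v: v is None or bool(re.fullmatch(r"\d{4}-\d{2}-\d{2}", v)),
}

def check_record(record):
    """Return the names of fields that fail their integrity rule."""
    return [field for field, rule in RULES.items()
            if field in record and not rule(record[field])]

# Example: a record with a truncated ISIN and an unknown currency code
# fails on exactly those two fields.
rec = {"isin": "US037833100", "currency": "USX", "maturity": "2030-06-30"}
print(check_record(rec))  # ['isin', 'currency']
```

In a utility of the kind described, records failing such checks would presumably be routed into exception management rather than passed downstream, though the article does not specify the workflow.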
According to Mason, the work to meet the first client’s requirements has been fairly rapid and has initially been focused on redirecting the data feeds via SmartStream’s system to clean up the data. The vendor began work in early July with the client and full rollout is expected in mid-September.
“The speed of implementation depends on the scope of the project – it can be quick but it may take longer if more areas of reference data are involved. It will also depend to some extent on downstream systems integration challenges,” he adds.
Rather than providing clients with new reference data codes, SmartStream is essentially cross-referencing existing data formats so that they can ‘talk’ to each other and be reconciled. It is generating its own internal identifiers against which to cross-reference data, but it is not expecting clients to adapt their systems to its codes. The vendor is therefore not seeking to compete with data providers such as Interactive Data or SIX Telekurs on the data feeds side but is instead aiming to append and enrich these feeds, says Mason.
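The mechanics of the cross-referencing are not spelled out in the article; as a rough sketch of the general idea, an internal identifier linking the existing schemes a client already uses (the identifier values and mapping structure below are invented for illustration) could work like this:

```python
# Hypothetical cross-reference table: one internal ID per security,
# linking the identifiers (ISIN, CUSIP, SEDOL) clients already hold.
XREF = [
    {"internal_id": "DC-000001", "isin": "US0378331005",
     "cusip": "037833100", "sedol": "2046251"},
    {"internal_id": "DC-000002", "isin": "GB0002634946",
     "cusip": None, "sedol": "0263494"},
]

# Build a lookup index so a record keyed on any scheme resolves
# to the same internal identifier.
INDEX = {}
for row in XREF:
    for scheme in ("isin", "cusip", "sedol"):
        if row[scheme]:
            INDEX[(scheme, row[scheme])] = row["internal_id"]

def resolve(scheme, code):
    """Map a known external identifier to the internal cross-reference ID."""
    return INDEX.get((scheme, code))

# Two feeds describing the same security under different schemes
# reconcile to one internal ID, with no new code imposed on either client.
print(resolve("cusip", "037833100") == resolve("isin", "US0378331005"))  # True
```

The point of the design, as the article describes it, is that the internal ID stays inside the utility: clients keep their existing identifiers, and reconciliation happens against the shared mapping.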
However, SIX Telekurs has its own reference data utility initiative in the Swiss market and is seemingly keen to use this model in other regions, thus putting it into direct competition with SmartStream. Mason is unfazed by this competition: “SIX Telekurs is a data vendor and will therefore struggle to provide the overall data set that the market requires. A more independent approach to data feeds is required.”
Regular readers of Reference Data Review will also recognise that the idea of a utility in the reference data space has proved controversial over recent months due to the European Central Bank’s (ECB) proposals on the subject. Mason is not dismissive of the idea but thinks it needs to be put in historical context: “The ECB’s proposals represent an admirable goal for the industry. We have met with Francis Gross, head of the statistics division at the ECB, to discuss their vision. They are seeking to understand the various identifiers in the market and seem to be looking to generate new codes rather than go the route of cross referencing to begin with. The industry has learnt in the past that such ambitious projects can quickly lose steam if the industry does not get on board from the start.”
He is confident that the SmartStream approach of taking the project one step at a time will pay off in the end. The vendor is initially looking to its current user base as potential clients for the utility, across both the buy side and the sell side. The focus is to get as many users on board as possible to achieve a tipping point for data standardisation. The vendor is also weighing pricing options for the service, debating the pros and cons of a subscription model versus pricing based on the number of reference data items involved in the project, adds Mason.
Over the last couple of months, a number of Reference Data Review readers have been compelled to write in to voice their concerns about a regulatory-driven initiative in the utility space, but what of a vendor-led initiative such as SmartStream’s? Would a competitively driven endeavour be more likely to succeed? Are readers keen to see such an offering on the market? Let us know your views by dropping us a line.