
A-Team Insight Blogs

SmartStream’s DClear Utilities Launches On Demand Data Dictionary Portal

As part of its wider bid to take on the reference data utility space, SmartStream Technologies has this week launched an on demand portal for access to verified and cleansed exchange and data provider documentation. John Mason, CEO of DClear, explains to Reference Data Review that although the Data Dictionary will form an integral part of the vendor’s overall Reference Data Utility, it is now available as a standalone structured information portal solution for those seeking to standardise, quantify and accelerate the translation of documentation.

SmartStream’s DClear business has been working on the solution for around six months, explains Mason, who was appointed as CEO of the vendor’s utility division back in September last year. According to DClear, the new offering simplifies the process of integrating new data providers and feeds into the client’s environment and improves data quality by reviewing, researching and verifying vendor feed documentation to ensure complete accuracy.

The length of time it took to develop the solution is not what matters, notes Mason, as the Data Dictionary will be an ongoing project and the vendor will continually add to its content. “The basic premise is an online portal where data managers and developers can go to be made aware of the feed definitions from providers in the market, be they exchanges or data vendors. At the moment that is done by manually searching for and sourcing PDFs and documents for this data,” he explains.

The solution will therefore act as a portal where that information is available as a reference and will allow firms to begin cross referencing identifiers from the feeds of various providers. “For example, one provider may call a certain feed a best bid offer, while another might call it best offer. It enables you to say that Thomson Reuters calls it A, while Interactive Data calls it C, so that you have some commonality across your providers,” elaborates Mason. Ultimately, this will allow firms to map this feed data to their internal company data, he adds.
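The kind of cross referencing Mason describes can be illustrated with a minimal sketch. The provider names echo his example, but the field codes, internal field name and structure below are assumptions for illustration, not DClear’s actual schema:

```python
# Minimal sketch of cross-referencing provider field definitions, as Mason describes.
# Field codes and the internal field name are hypothetical, not DClear's actual schema.

# Each provider documents the same underlying concept under its own code.
PROVIDER_FIELDS = {
    "Thomson Reuters": {"best bid offer": "A"},
    "Interactive Data": {"best bid offer": "C"},
}

# The firm's own internal field name for that common concept.
INTERNAL_FIELDS = {"best bid offer": "BBO_PRICE"}


def translate(provider: str, concept: str) -> tuple[str, str]:
    """Return (provider's field code, firm's internal field name) for a common concept."""
    return PROVIDER_FIELDS[provider][concept], INTERNAL_FIELDS[concept]


if __name__ == "__main__":
    for provider in PROVIDER_FIELDS:
        code, internal = translate(provider, "best bid offer")
        print(f"{provider}: provider field {code!r} -> internal field {internal!r}")
```

The point of such a structure is the pivot: once every provider’s code is keyed to a common concept, mapping any feed to the firm’s internal data model becomes a lookup rather than a documentation hunt.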

DClear indicates that the information will be provided in logical categories and users will not be required to do any front end integration work, as the data is viewable through a web interface. The web front end has become increasingly popular of late within the vendor community, as these providers seek to offer the lowest total cost of ownership for their solutions.

Six tier one sell side institutions are currently trialling the solution, says Mason, and he is hopeful that the new on demand approach will prove popular in the current environment.

Mason reckons the solution should provide real savings overall as it removes the need to build and maintain an in-house data dictionary and the associated staffing costs. The DClear offering also removes the operational overhead of collating, verifying and maintaining these data definitions across providers. “DClear’s utility approach removes this cost and uncertainty around staffing by ensuring definition information is always available, with clients only paying when they need to use the information. As a result, firms not only receive better quality data and achieve their global coverage requirements but do so at a fraction of the cost compared to managing it internally,” he claims.

He notes that it is in keeping with the current trend towards a “cloud approach” to data access, by enabling a quicker and easier way to store and retrieve data. “You can access it by paying for an hour’s access on your credit card,” explains Mason. “This is a real, true on demand approach to this space.” However, the vendor also offers a “more substantive” subscription service for those that wish to receive cross referencing and mapping to internal systems for this data.

As well as vendor and exchange data feeds, the solution will also include mapping to the work of industry bodies such as the EDM Council’s semantics repository. “That is one of the areas we are looking at to improve cross referencing capabilities,” says Mason. “Technical terms to business terms and in-house terms to business terms.”
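The layered mapping Mason alludes to, in which feed-level technical terms and in-house terms both resolve to a common business term, could be sketched roughly as follows. All identifiers here are invented for illustration and are not drawn from the EDM Council’s semantics repository:

```python
# Illustrative only: invented terms, not actual EDM Council semantics repository entries.

# Business terms act as the pivot layer between feed-level technical terms
# and a firm's in-house terms.
TECHNICAL_TO_BUSINESS = {
    "BST_BID_OFR": "Best Bid Offer",   # hypothetical feed-level technical term
}
INHOUSE_TO_BUSINESS = {
    "BBO_PRICE": "Best Bid Offer",     # hypothetical in-house term
}


def inhouse_terms_for(technical_term: str) -> list[str]:
    """Find in-house terms that share a business term with a feed-level technical term."""
    business = TECHNICAL_TO_BUSINESS.get(technical_term)
    return [term for term, biz in INHOUSE_TO_BUSINESS.items() if biz == business]


print(inhouse_terms_for("BST_BID_OFR"))  # ['BBO_PRICE']
```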

Cross referencing has become increasingly important as the world has become more fragmented, he contends. There are multiple identifiers across the market and if people want to get a full picture of where liquidity resides, they need to see how items have been defined.

The solution fits into the vendor’s overall utility model, which Reference Data Review first exclusively covered back in May 2008, when it was part of the Dubai International Financial Centre (DIFC) and before the DIFC’s acquisition of SmartStream. “We have a number of tools out there including a cross referencing symbology tool and this data dictionary that takes us into another layer, which means we are able to cross reference deeper definitions,” notes Mason.

The overall offering is aimed at giving users transparency and understanding of the data they use across multiple formats, providers or naming conventions, he says. This is in the absence of greater standardisation, which may be attainable in the long term, and is part of the day to day requirements of dealing with a “multi-faceted” fragmented environment.

Mason is sceptical that the regulatory community will be able to drive standardisation overnight and this is why the vendor has focused on the meta layer of cross referencing data. Initiatives such as the National Institute of Finance (NIF) in the US and the European Central Bank’s (ECB) reference data utility may encourage some degree of standardisation in the long term, but this will be a very gradual process, he adds.

“First people need to get transparency into what they have got already, rather than leap to a one size fits all standard, which I think the industry would not be able to cope with at the moment,” he contends. “I’d let Darwin take care of the standardisation process. As technology gets refreshed, so people can move to a more standardised set of codes. There are too many systems that use embedded codes such as RIC codes for the market to be able to move to a single new standard.”
