About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

SmartStream’s DClear Utilities Launches On Demand Data Dictionary Portal


As part of its wider bid to take on the reference data utility space, SmartStream Technologies has this week launched an on demand portal providing access to verified and cleansed exchange and data provider documentation. John Mason, CEO of DClear, explains to Reference Data Review that although the Data Dictionary will form an integral part of the vendor’s overall Reference Data Utility, it is also available as a standalone structured information portal for those seeking to standardise, quantify and accelerate the translation of documentation.

SmartStream’s DClear business has been working on the solution for around six months, explains Mason, who was appointed as CEO of the vendor’s utility division back in September last year. According to DClear, the new offering simplifies the process of integrating new data providers and feeds into the client’s environment and improves data quality by reviewing, researching and verifying vendor feed documentation to ensure complete accuracy.

It is not the length of time taken to develop the solution that matters, notes Mason, as the Data Dictionary will be an ongoing project and the vendor will continually add to its content. “The basic premise is an online portal where data managers and developers can go to be made aware of the feed definitions from providers in the market, be they exchanges or data vendors. At the moment that is done by manually searching for and sourcing PDFs and documents for this data,” he explains.

The solution will therefore act as a portal where that information is available as a reference and will allow firms to begin cross referencing identifiers from the feeds of various providers. “For example, one provider may call a certain feed a best bid offer, while another might call it best offer. It enables you to say that Thomson Reuters calls it A, while Interactive Data calls it C, so that you have some commonality across your providers,” elaborates Mason. Ultimately, this will allow firms to map this feed data to their internal company data, he adds.
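The cross referencing Mason describes can be pictured as a simple lookup table that maps each provider’s field name to a single canonical term. The sketch below is purely illustrative: the provider labels, field names and canonical terms are invented for the example and are not DClear’s actual schema.

```python
from typing import Optional

# Illustrative cross-reference table (hypothetical names, not DClear's model):
# canonical internal term -> each provider's own label for the same field.
FIELD_XREF = {
    "best_bid_offer": {
        "Thomson Reuters": "BID_ASK",    # hypothetical provider label
        "Interactive Data": "BestOffer", # hypothetical provider label
    },
}

def to_canonical(provider: str, field: str) -> Optional[str]:
    """Return the canonical term for a provider-specific field name, if known."""
    for canonical, aliases in FIELD_XREF.items():
        if aliases.get(provider) == field:
            return canonical
    return None
```

With such a table in place, downstream systems can translate any provider’s label into the firm’s internal naming convention, which is the commonality across providers Mason refers to.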

DClear indicates that the information will be provided in logical categories and users will not be required to do any front end integration work, as the data is viewable through a web interface. The web front end has become increasingly popular of late within the vendor community, as these providers seek to offer the lowest total cost of ownership for their solutions.

Six tier one sell side institutions are currently trialling the solution, says Mason, and he is hopeful that the new on demand approach will prove popular in the current environment.

Mason reckons the solution should provide real savings overall, as it removes the need to build and maintain an in-house data dictionary and the related staffing costs. The DClear offering also removes the operational overhead of collating, verifying and maintaining these data definitions across providers. “DClear’s utility approach removes this cost and uncertainty around staffing by ensuring definition information is always available, with clients only paying when they need to use the information. As a result, firms not only receive better quality data and achieve their global coverage requirements but do so at a fraction of the cost compared to managing it internally,” he claims.

He notes that it is in keeping with the current trend towards a “cloud approach” to data access, by enabling a quicker and easier way to store and retrieve data. “You can access it by paying for an hour’s access on your credit card,” explains Mason. “This is a real, true on demand approach to this space.” However, the vendor also offers a “more substantive” subscription service for those that wish to receive cross referencing and mapping to internal systems for this data.

As well as vendor and exchange data feeds, the solution will also include mapping to the work of industry bodies such as the EDM Council’s semantics repository. “That is one of the areas we are looking at to improve cross referencing capabilities,” says Mason. “Technical terms to business terms and in-house terms to business terms.”

Cross referencing has become increasingly important as the world has become more fragmented, he contends. There are multiple identifiers across the market and if people want to get a full picture of where liquidity resides, they need to see how items have been defined.

The solution fits into the vendor’s overall utility model, which Reference Data Review first exclusively covered back in May 2008, when it was part of the Dubai International Financial Centre (DIFC) and before the DIFC’s acquisition of SmartStream. “We have a number of tools out there including a cross referencing symbology tool and this data dictionary that takes us into another layer, which means we are able to cross reference deeper definitions,” notes Mason.

The overall offering is aimed at giving users the transparency and understanding of the data they use across multiple formats, providers or naming conventions, he says. This is in lieu of greater standardisation, which may be attainable in the long term, and part of the day to day requirements of dealing with a “multi-faceted” fragmented environment.

Mason is sceptical that the regulatory community will be able to drive standardisation overnight and this is why the vendor has focused on the meta layer of cross referencing data. Initiatives such as the National Institute of Finance (NIF) in the US and the European Central Bank’s (ECB) reference data utility may encourage some degree of standardisation in the long term, but this will be a very gradual process, he adds.

“First people need to get transparency into what they have got already, rather than leap to a one size fits all standard, which I think the industry would not be able to cope with at the moment,” he contends. “I’d let Darwin take care of the standardisation process. As technology gets refreshed, so people can move to a more standardised set of codes. There are too many systems that use embedded codes such as RIC codes for the market to be able to move to a single new standard.”

