The leading knowledge platform for the financial technology industry


SmartStream’s DClear Utilities Launches On Demand Data Dictionary Portal

As part of its wider bid to take on the reference data utility space, SmartStream Technologies has this week launched an on demand portal for access to verified and cleansed exchange and data provider documentation. John Mason, CEO of DClear, explains to Reference Data Review that although the Data Dictionary will form an integral part of the vendor’s overall Reference Data Utility, it is now available as a standalone structured information portal solution for those seeking to standardise, quantify and accelerate the translation of documentation.

SmartStream’s DClear business has been working on the solution for around six months, explains Mason, who was appointed as CEO of the vendor’s utility division back in September last year. According to DClear, the new offering simplifies the process of integrating new data providers and feeds into the client’s environment and improves data quality by reviewing, researching and verifying vendor feed documentation to ensure complete accuracy.

It is not the length of time it took to develop the solution that matters, notes Mason, as the Data Dictionary will be an ongoing project and the vendor will continually add to its content. “The basic premise is an online portal where data managers and developers can go to be made aware of the feed definitions from providers in the market, be they exchanges or data vendors. At the moment that is done by manually searching for and sourcing PDFs and documents for this data,” he explains.

The solution will therefore act as a portal where that information is available as a reference and will allow firms to begin cross referencing identifiers from the feeds of various providers. “For example, one provider may call a certain feed a best bid offer, while another might call it best offer. It enables you to say that Thomson Reuters calls it A, while Interactive Data calls it C, so that you have some commonality across your providers,” elaborates Mason. Ultimately, this will allow firms to map this feed data to their internal company data, he adds.
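The cross-referencing Mason describes can be pictured as a lookup keyed by a canonical field name. A minimal sketch, assuming illustrative provider and field labels (these are examples, not DClear’s actual schema or data):

```python
# Minimal sketch of a feed-field cross-reference dictionary.
# Provider names and field labels are illustrative, not DClear's actual content.

CANONICAL_FIELDS = {
    "best_bid_offer": {
        "Thomson Reuters": "BEST_BID_OFFER",
        "Interactive Data": "BEST_OFFER",
    },
}

def provider_label(canonical: str, provider: str) -> str:
    """Return a provider's own label for a canonical field name."""
    return CANONICAL_FIELDS[canonical][provider]

def canonical_for(provider: str, label: str) -> str:
    """Reverse lookup: find the canonical name behind a provider's label."""
    for canonical, labels in CANONICAL_FIELDS.items():
        if labels.get(provider) == label:
            return canonical
    raise KeyError(f"{provider!r} label {label!r} is not mapped")
```

Once every provider label resolves to a common canonical name, mapping feed data onto a firm’s internal identifiers becomes one further lookup rather than a per-provider reconciliation exercise.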

DClear indicates that the information will be provided in logical categories and users will not be required to do any front end integration work, as the data is viewable through a web interface. The web front end has become increasingly popular of late within the vendor community, as these providers seek to offer the lowest total cost of ownership for their solutions.

Six tier one sell side institutions are currently trialling the solution, says Mason, and he is hopeful that the new on demand approach will prove popular in the current environment.

Mason reckons the solution should deliver real savings overall, as it removes the need to build and maintain an in-house data dictionary and the associated staffing costs. The DClear offering also removes the operational overhead of collating, verifying and maintaining these data definitions across providers. “DClear’s utility approach removes this cost and uncertainty around staffing by ensuring definition information is always available, with clients only paying when they need to use the information. As a result, firms not only receive better quality data and achieve their global coverage requirements but do so at a fraction of the cost compared to managing it internally,” he claims.

He notes that it is in keeping with the current trend towards a “cloud approach” to data access, by enabling a quicker and easier way to store and retrieve data. “You can access it by paying for an hour’s access on your credit card,” explains Mason. “This is a real, true on demand approach to this space.” However, the vendor also offers a “more substantive” subscription service for those that wish to receive cross referencing and mapping to internal systems for this data.

As well as vendor and exchange data feeds, the solution will also include mapping to the work of industry bodies such as the EDM Council’s semantics repository. “That is one of the areas we are looking at to improve cross referencing capabilities,” says Mason. “Technical terms to business terms and in-house terms to business terms.”

Cross referencing has become increasingly important as the world has become more fragmented, he contends. There are multiple identifiers across the market and if people want to get a full picture of where liquidity resides, they need to see how items have been defined.

The solution fits into the vendor’s overall utility model, which Reference Data Review first exclusively covered back in May 2008, when it was part of the Dubai International Financial Centre (DIFC) and before the DIFC’s acquisition of SmartStream. “We have a number of tools out there including a cross referencing symbology tool and this data dictionary that takes us into another layer, which means we are able to cross reference deeper definitions,” notes Mason.

The overall offering is aimed at giving users transparency into, and understanding of, the data they use across multiple formats, providers and naming conventions, he says. This is in the absence of greater standardisation, which may be attainable in the long term, and part of the day to day requirements of dealing with a “multi-faceted” fragmented environment.

Mason is sceptical that the regulatory community will be able to drive standardisation overnight and this is why the vendor has focused on the meta layer of cross referencing data. Initiatives such as the National Institute of Finance (NIF) in the US and the European Central Bank’s (ECB) reference data utility may encourage some degree of standardisation in the long term, but this will be a very gradual process, he adds.

“First people need to get transparency into what they have got already, rather than leap to a one size fits all standard, which I think the industry would not be able to cope with at the moment,” he contends. “I’d let Darwin take care of the standardisation process. As technology gets refreshed, so people can move to a more standardised set of codes. There are too many systems that use embedded codes such as RIC codes for the market to be able to move to a single new standard.”
