The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

SmartStream’s DClear Utilities Launches On Demand Data Dictionary Portal

As part of its wider bid to take on the reference data utility space, SmartStream Technologies has this week launched an on demand portal for access to verified and cleansed exchange and data provider documentation. John Mason, CEO of DClear, explains to Reference Data Review that although the Data Dictionary will form an integral part of the vendor’s overall Reference Data Utility, it is now available as a standalone structured information portal solution for those seeking to standardise, quantify and accelerate the translation of documentation.

SmartStream’s DClear business has been working on the solution for around six months, explains Mason, who was appointed as CEO of the vendor’s utility division back in September last year. According to DClear, the new offering simplifies the process of integrating new data providers and feeds into the client’s environment and improves data quality by reviewing, researching and verifying vendor feed documentation to ensure complete accuracy.

It is not the length of time it took to develop the solution that matters, notes Mason, as the Data Dictionary will be an ongoing project and the vendor will continually add to its content. “The basic premise is an online portal where data managers and developers can go to be made aware of the feed definitions from providers in the market, be they exchanges or data vendors. At the moment that is done by manually searching for and sourcing PDFs and documents for this data,” he explains.

The solution will therefore act as a portal where that information is available as a reference and will allow firms to begin cross referencing identifiers from the feeds of various providers. “For example, one provider may call a certain feed a best bid offer, while another might call it best offer. It enables you to say that Thomson Reuters calls it A, while Interactive Data calls it C, so that you have some commonality across your providers,” elaborates Mason. Ultimately, this will allow firms to map this feed data to their internal company data, he adds.
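The cross-referencing Mason describes amounts to mapping each provider's field name onto a common internal term, and back again. A minimal sketch of that idea follows; the provider names echo the quote above, but the field names and internal terms are illustrative only, not taken from the actual DClear dictionary:

```python
# Sketch of cross-referencing provider field names to a common internal
# term. All field names and internal terms here are hypothetical.

# Each internal (canonical) term maps to the name each provider uses for it.
CROSS_REFERENCE = {
    "best_bid_offer": {
        "Thomson Reuters": "A",
        "Interactive Data": "C",
    },
}

def provider_name(internal_term: str, provider: str) -> str:
    """Return the field name a given provider uses for an internal term."""
    return CROSS_REFERENCE[internal_term][provider]

def internal_term(provider: str, field: str) -> str:
    """Reverse lookup: map a provider's field name back to the internal term."""
    for term, providers in CROSS_REFERENCE.items():
        if providers.get(provider) == field:
            return term
    raise KeyError(f"{provider} field {field!r} not in dictionary")
```

Mapping feed data to a firm's internal company data is then just one more lookup layer keyed on the canonical term.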

DClear indicates that the information will be provided in logical categories and users will not be required to do any front end integration work, as the data is viewable through a web interface. The web front end has become increasingly popular of late within the vendor community, as these providers seek to offer the lowest total cost of ownership for their solutions.

Six tier one sell side institutions are currently trialling the solution, says Mason, and he is hopeful that the new on demand approach will prove popular in the current environment.

Mason reckons the solution should provide real savings overall, as it removes the need to build and maintain an in-house data dictionary, along with the associated staffing costs. The DClear offering also removes the operational overhead of collating, verifying and maintaining these data definitions across providers. “DClear’s utility approach removes this cost and uncertainty around staffing by ensuring definition information is always available, with clients only paying when they need to use the information. As a result, firms not only receive better quality data and achieve their global coverage requirements but do so at a fraction of the cost compared to managing it internally,” he claims.

He notes that it is in keeping with the current trend towards a “cloud approach” to data access, by enabling a quicker and easier way to store and retrieve data. “You can access it by paying for an hour’s access on your credit card,” explains Mason. “This is a real, true on demand approach to this space.” However, the vendor also offers a “more substantive” subscription service for those that wish to receive cross referencing and mapping to internal systems for this data.

As well as vendor and exchange data feeds, the solution will also include mapping to the work of industry bodies such as the EDM Council’s semantics repository. “That is one of the areas we are looking at to improve cross referencing capabilities,” says Mason. “Technical terms to business terms and in-house terms to business terms.”
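The “technical terms to business terms and in-house terms to business terms” mapping Mason mentions can be pictured as two vocabularies joined through a shared business-term layer. A hypothetical sketch, with all term names invented for illustration:

```python
# Hypothetical sketch of layered cross-referencing: a feed's technical
# terms and a firm's in-house terms each map to shared business terms,
# so the two vocabularies can be joined through that middle layer.

TECHNICAL_TO_BUSINESS = {
    "BID_ASK_SPRD": "bid-ask spread",
    "LST_TRD_PX": "last trade price",
}

IN_HOUSE_TO_BUSINESS = {
    "spread_bps": "bid-ask spread",
    "px_last": "last trade price",
}

def in_house_for_technical(technical_term: str) -> list[str]:
    """Resolve a feed's technical term to matching in-house terms
    via the shared business-term layer."""
    business = TECHNICAL_TO_BUSINESS[technical_term]
    return [k for k, v in IN_HOUSE_TO_BUSINESS.items() if v == business]
```

A shared semantics repository, such as the EDM Council's, would effectively supply the business-term layer that both sides map onto.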

Cross referencing has become increasingly important as the world has become more fragmented, he contends. There are multiple identifiers across the market and if people want to get a full picture of where liquidity resides, they need to see how items have been defined.

The solution fits into the vendor’s overall utility model, which Reference Data Review first covered exclusively back in May 2008, when the project was part of the Dubai International Financial Centre (DIFC), before the DIFC’s acquisition of SmartStream. “We have a number of tools out there including a cross referencing symbology tool and this data dictionary that takes us into another layer, which means we are able to cross reference deeper definitions,” notes Mason.

The overall offering is aimed at giving users transparency into, and understanding of, the data they use across multiple formats, providers and naming conventions, he says. This serves in the absence of greater standardisation, which may be attainable in the long term, and addresses the day to day requirements of dealing with a “multi-faceted” fragmented environment.

Mason is sceptical that the regulatory community will be able to drive standardisation overnight and this is why the vendor has focused on the meta layer of cross referencing data. Initiatives such as the National Institute of Finance (NIF) in the US and the European Central Bank’s (ECB) reference data utility may encourage some degree of standardisation in the long term, but this will be a very gradual process, he adds.

“First people need to get transparency into what they have got already, rather than leap to a one size fits all standard, which I think the industry would not be able to cope with at the moment,” he contends. “I’d let Darwin take care of the standardisation process. As technology gets refreshed, so people can move to a more standardised set of codes. There are too many systems that use embedded codes such as RIC codes for the market to be able to move to a single new standard.”
